Mar 20 22:03:21.049599 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Mar 20 19:36:47 -00 2025 Mar 20 22:03:21.049628 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=619bfa043b53ac975036e415994a80721794ae8277072d0a93c174b4f7768019 Mar 20 22:03:21.049639 kernel: BIOS-provided physical RAM map: Mar 20 22:03:21.049647 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Mar 20 22:03:21.049655 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Mar 20 22:03:21.049664 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Mar 20 22:03:21.050742 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable Mar 20 22:03:21.050751 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved Mar 20 22:03:21.050759 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Mar 20 22:03:21.050768 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Mar 20 22:03:21.050776 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable Mar 20 22:03:21.050784 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Mar 20 22:03:21.050792 kernel: NX (Execute Disable) protection: active Mar 20 22:03:21.050800 kernel: APIC: Static calls initialized Mar 20 22:03:21.050815 kernel: SMBIOS 3.0.0 present. 
Mar 20 22:03:21.050824 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014 Mar 20 22:03:21.050833 kernel: Hypervisor detected: KVM Mar 20 22:03:21.050841 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Mar 20 22:03:21.050850 kernel: kvm-clock: using sched offset of 3655301484 cycles Mar 20 22:03:21.050859 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Mar 20 22:03:21.050870 kernel: tsc: Detected 1996.249 MHz processor Mar 20 22:03:21.050879 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Mar 20 22:03:21.050889 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Mar 20 22:03:21.050898 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000 Mar 20 22:03:21.050907 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Mar 20 22:03:21.050916 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Mar 20 22:03:21.050925 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000 Mar 20 22:03:21.050934 kernel: ACPI: Early table checksum verification disabled Mar 20 22:03:21.050945 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS ) Mar 20 22:03:21.050954 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 20 22:03:21.050963 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 20 22:03:21.050972 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 20 22:03:21.050981 kernel: ACPI: FACS 0x00000000BFFE0000 000040 Mar 20 22:03:21.050990 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Mar 20 22:03:21.050998 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 20 22:03:21.051007 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc] Mar 20 22:03:21.051016 kernel: ACPI: Reserving DSDT table memory at [mem 
0xbffe0040-0xbffe1a48] Mar 20 22:03:21.051026 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f] Mar 20 22:03:21.051035 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c] Mar 20 22:03:21.051044 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64] Mar 20 22:03:21.051056 kernel: No NUMA configuration found Mar 20 22:03:21.051065 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff] Mar 20 22:03:21.051074 kernel: NODE_DATA(0) allocated [mem 0x13fffa000-0x13fffffff] Mar 20 22:03:21.051084 kernel: Zone ranges: Mar 20 22:03:21.051095 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Mar 20 22:03:21.051105 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Mar 20 22:03:21.051114 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff] Mar 20 22:03:21.051123 kernel: Movable zone start for each node Mar 20 22:03:21.051132 kernel: Early memory node ranges Mar 20 22:03:21.051141 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Mar 20 22:03:21.051150 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff] Mar 20 22:03:21.051159 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff] Mar 20 22:03:21.051172 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff] Mar 20 22:03:21.051181 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Mar 20 22:03:21.051191 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Mar 20 22:03:21.051200 kernel: On node 0, zone Normal: 35 pages in unavailable ranges Mar 20 22:03:21.051209 kernel: ACPI: PM-Timer IO Port: 0x608 Mar 20 22:03:21.051218 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Mar 20 22:03:21.051227 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Mar 20 22:03:21.051237 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Mar 20 22:03:21.051246 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Mar 20 22:03:21.051257 kernel: ACPI: 
INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Mar 20 22:03:21.051266 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Mar 20 22:03:21.051275 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Mar 20 22:03:21.051284 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Mar 20 22:03:21.051293 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs Mar 20 22:03:21.051302 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Mar 20 22:03:21.051311 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices Mar 20 22:03:21.051320 kernel: Booting paravirtualized kernel on KVM Mar 20 22:03:21.051330 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Mar 20 22:03:21.051341 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Mar 20 22:03:21.051350 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576 Mar 20 22:03:21.051360 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152 Mar 20 22:03:21.051368 kernel: pcpu-alloc: [0] 0 1 Mar 20 22:03:21.051377 kernel: kvm-guest: PV spinlocks disabled, no host support Mar 20 22:03:21.051388 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=619bfa043b53ac975036e415994a80721794ae8277072d0a93c174b4f7768019 Mar 20 22:03:21.051398 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Mar 20 22:03:21.051407 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 20 22:03:21.051419 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 20 22:03:21.051428 kernel: Fallback order for Node 0: 0 Mar 20 22:03:21.051437 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1031901 Mar 20 22:03:21.051446 kernel: Policy zone: Normal Mar 20 22:03:21.051455 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 20 22:03:21.051464 kernel: software IO TLB: area num 2. Mar 20 22:03:21.051474 kernel: Memory: 3962120K/4193772K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43592K init, 1472K bss, 231392K reserved, 0K cma-reserved) Mar 20 22:03:21.051483 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Mar 20 22:03:21.051492 kernel: ftrace: allocating 37985 entries in 149 pages Mar 20 22:03:21.051504 kernel: ftrace: allocated 149 pages with 4 groups Mar 20 22:03:21.051513 kernel: Dynamic Preempt: voluntary Mar 20 22:03:21.051522 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 20 22:03:21.051532 kernel: rcu: RCU event tracing is enabled. Mar 20 22:03:21.051542 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Mar 20 22:03:21.051551 kernel: Trampoline variant of Tasks RCU enabled. Mar 20 22:03:21.051560 kernel: Rude variant of Tasks RCU enabled. Mar 20 22:03:21.051569 kernel: Tracing variant of Tasks RCU enabled. Mar 20 22:03:21.051578 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Mar 20 22:03:21.051589 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Mar 20 22:03:21.051598 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Mar 20 22:03:21.051607 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. 
Mar 20 22:03:21.051616 kernel: Console: colour VGA+ 80x25 Mar 20 22:03:21.051625 kernel: printk: console [tty0] enabled Mar 20 22:03:21.051634 kernel: printk: console [ttyS0] enabled Mar 20 22:03:21.051643 kernel: ACPI: Core revision 20230628 Mar 20 22:03:21.051653 kernel: APIC: Switch to symmetric I/O mode setup Mar 20 22:03:21.051662 kernel: x2apic enabled Mar 20 22:03:21.053171 kernel: APIC: Switched APIC routing to: physical x2apic Mar 20 22:03:21.053182 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Mar 20 22:03:21.053191 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized Mar 20 22:03:21.053201 kernel: Calibrating delay loop (skipped) preset value.. 3992.49 BogoMIPS (lpj=1996249) Mar 20 22:03:21.053210 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Mar 20 22:03:21.053219 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Mar 20 22:03:21.053228 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Mar 20 22:03:21.053238 kernel: Spectre V2 : Mitigation: Retpolines Mar 20 22:03:21.053247 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Mar 20 22:03:21.053258 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT Mar 20 22:03:21.053267 kernel: Speculative Store Bypass: Vulnerable Mar 20 22:03:21.053277 kernel: x86/fpu: x87 FPU will use FXSAVE Mar 20 22:03:21.053286 kernel: Freeing SMP alternatives memory: 32K Mar 20 22:03:21.053302 kernel: pid_max: default: 32768 minimum: 301 Mar 20 22:03:21.053313 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 20 22:03:21.053323 kernel: landlock: Up and running. Mar 20 22:03:21.053332 kernel: SELinux: Initializing. 
Mar 20 22:03:21.053342 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 20 22:03:21.053351 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 20 22:03:21.053361 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3) Mar 20 22:03:21.053371 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 20 22:03:21.053383 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 20 22:03:21.053392 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Mar 20 22:03:21.053402 kernel: Performance Events: AMD PMU driver. Mar 20 22:03:21.053412 kernel: ... version: 0 Mar 20 22:03:21.053421 kernel: ... bit width: 48 Mar 20 22:03:21.053432 kernel: ... generic registers: 4 Mar 20 22:03:21.053442 kernel: ... value mask: 0000ffffffffffff Mar 20 22:03:21.053452 kernel: ... max period: 00007fffffffffff Mar 20 22:03:21.053461 kernel: ... fixed-purpose events: 0 Mar 20 22:03:21.053471 kernel: ... event mask: 000000000000000f Mar 20 22:03:21.053480 kernel: signal: max sigframe size: 1440 Mar 20 22:03:21.053489 kernel: rcu: Hierarchical SRCU implementation. Mar 20 22:03:21.053499 kernel: rcu: Max phase no-delay instances is 400. Mar 20 22:03:21.053509 kernel: smp: Bringing up secondary CPUs ... Mar 20 22:03:21.053521 kernel: smpboot: x86: Booting SMP configuration: Mar 20 22:03:21.053530 kernel: .... 
node #0, CPUs: #1 Mar 20 22:03:21.053540 kernel: smp: Brought up 1 node, 2 CPUs Mar 20 22:03:21.053549 kernel: smpboot: Max logical packages: 2 Mar 20 22:03:21.053559 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS) Mar 20 22:03:21.053569 kernel: devtmpfs: initialized Mar 20 22:03:21.053578 kernel: x86/mm: Memory block size: 128MB Mar 20 22:03:21.053588 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 20 22:03:21.053597 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Mar 20 22:03:21.053609 kernel: pinctrl core: initialized pinctrl subsystem Mar 20 22:03:21.053619 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 20 22:03:21.053628 kernel: audit: initializing netlink subsys (disabled) Mar 20 22:03:21.053638 kernel: audit: type=2000 audit(1742508200.150:1): state=initialized audit_enabled=0 res=1 Mar 20 22:03:21.053648 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 20 22:03:21.053657 kernel: thermal_sys: Registered thermal governor 'user_space' Mar 20 22:03:21.054133 kernel: cpuidle: using governor menu Mar 20 22:03:21.054148 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 20 22:03:21.054158 kernel: dca service started, version 1.12.1 Mar 20 22:03:21.054171 kernel: PCI: Using configuration type 1 for base access Mar 20 22:03:21.054181 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Mar 20 22:03:21.054191 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 20 22:03:21.054201 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Mar 20 22:03:21.054210 kernel: ACPI: Added _OSI(Module Device) Mar 20 22:03:21.054220 kernel: ACPI: Added _OSI(Processor Device) Mar 20 22:03:21.054229 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 20 22:03:21.054239 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 20 22:03:21.054249 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 20 22:03:21.054261 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Mar 20 22:03:21.054270 kernel: ACPI: Interpreter enabled Mar 20 22:03:21.054280 kernel: ACPI: PM: (supports S0 S3 S5) Mar 20 22:03:21.054289 kernel: ACPI: Using IOAPIC for interrupt routing Mar 20 22:03:21.054299 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Mar 20 22:03:21.054309 kernel: PCI: Using E820 reservations for host bridge windows Mar 20 22:03:21.054318 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F Mar 20 22:03:21.054328 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Mar 20 22:03:21.054542 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3] Mar 20 22:03:21.054653 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI] Mar 20 22:03:21.055464 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge Mar 20 22:03:21.055482 kernel: acpiphp: Slot [3] registered Mar 20 22:03:21.055491 kernel: acpiphp: Slot [4] registered Mar 20 22:03:21.055500 kernel: acpiphp: Slot [5] registered Mar 20 22:03:21.055509 kernel: acpiphp: Slot [6] registered Mar 20 22:03:21.055519 kernel: acpiphp: Slot [7] registered Mar 20 22:03:21.055532 kernel: acpiphp: Slot [8] registered Mar 20 22:03:21.055541 kernel: acpiphp: Slot [9] registered Mar 20 22:03:21.055550 kernel: 
acpiphp: Slot [10] registered Mar 20 22:03:21.055559 kernel: acpiphp: Slot [11] registered Mar 20 22:03:21.055568 kernel: acpiphp: Slot [12] registered Mar 20 22:03:21.055577 kernel: acpiphp: Slot [13] registered Mar 20 22:03:21.055586 kernel: acpiphp: Slot [14] registered Mar 20 22:03:21.055595 kernel: acpiphp: Slot [15] registered Mar 20 22:03:21.055604 kernel: acpiphp: Slot [16] registered Mar 20 22:03:21.055613 kernel: acpiphp: Slot [17] registered Mar 20 22:03:21.055624 kernel: acpiphp: Slot [18] registered Mar 20 22:03:21.055632 kernel: acpiphp: Slot [19] registered Mar 20 22:03:21.055641 kernel: acpiphp: Slot [20] registered Mar 20 22:03:21.055650 kernel: acpiphp: Slot [21] registered Mar 20 22:03:21.055659 kernel: acpiphp: Slot [22] registered Mar 20 22:03:21.055695 kernel: acpiphp: Slot [23] registered Mar 20 22:03:21.055705 kernel: acpiphp: Slot [24] registered Mar 20 22:03:21.055715 kernel: acpiphp: Slot [25] registered Mar 20 22:03:21.055723 kernel: acpiphp: Slot [26] registered Mar 20 22:03:21.055735 kernel: acpiphp: Slot [27] registered Mar 20 22:03:21.055744 kernel: acpiphp: Slot [28] registered Mar 20 22:03:21.055753 kernel: acpiphp: Slot [29] registered Mar 20 22:03:21.055762 kernel: acpiphp: Slot [30] registered Mar 20 22:03:21.055771 kernel: acpiphp: Slot [31] registered Mar 20 22:03:21.055780 kernel: PCI host bridge to bus 0000:00 Mar 20 22:03:21.055905 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Mar 20 22:03:21.056003 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Mar 20 22:03:21.056096 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Mar 20 22:03:21.056192 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Mar 20 22:03:21.056284 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window] Mar 20 22:03:21.056375 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Mar 20 22:03:21.056531 kernel: pci 0000:00:00.0: 
[8086:1237] type 00 class 0x060000 Mar 20 22:03:21.056646 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 Mar 20 22:03:21.057823 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 Mar 20 22:03:21.057938 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f] Mar 20 22:03:21.058040 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Mar 20 22:03:21.058140 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Mar 20 22:03:21.058240 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Mar 20 22:03:21.058340 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Mar 20 22:03:21.058451 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 Mar 20 22:03:21.058557 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI Mar 20 22:03:21.058656 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB Mar 20 22:03:21.060806 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 Mar 20 22:03:21.060919 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref] Mar 20 22:03:21.061021 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref] Mar 20 22:03:21.061120 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff] Mar 20 22:03:21.061220 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref] Mar 20 22:03:21.061325 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Mar 20 22:03:21.061437 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 Mar 20 22:03:21.061538 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf] Mar 20 22:03:21.061637 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff] Mar 20 22:03:21.062565 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref] Mar 20 22:03:21.062733 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref] Mar 20 22:03:21.062848 kernel: pci 0000:00:04.0: [1af4:1001] 
type 00 class 0x010000 Mar 20 22:03:21.062956 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f] Mar 20 22:03:21.063055 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff] Mar 20 22:03:21.063152 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref] Mar 20 22:03:21.063264 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 Mar 20 22:03:21.063369 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff] Mar 20 22:03:21.063467 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref] Mar 20 22:03:21.063575 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 Mar 20 22:03:21.063700 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f] Mar 20 22:03:21.063803 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff] Mar 20 22:03:21.063916 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref] Mar 20 22:03:21.063931 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Mar 20 22:03:21.063941 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Mar 20 22:03:21.063951 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Mar 20 22:03:21.063961 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Mar 20 22:03:21.063971 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9 Mar 20 22:03:21.063986 kernel: iommu: Default domain type: Translated Mar 20 22:03:21.063996 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Mar 20 22:03:21.064005 kernel: PCI: Using ACPI for IRQ routing Mar 20 22:03:21.064015 kernel: PCI: pci_cache_line_size set to 64 bytes Mar 20 22:03:21.064025 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Mar 20 22:03:21.064034 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff] Mar 20 22:03:21.064133 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device Mar 20 22:03:21.064232 kernel: pci 0000:00:02.0: vgaarb: bridge control possible Mar 20 22:03:21.064332 kernel: pci 0000:00:02.0: 
vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Mar 20 22:03:21.064350 kernel: vgaarb: loaded Mar 20 22:03:21.064360 kernel: clocksource: Switched to clocksource kvm-clock Mar 20 22:03:21.064370 kernel: VFS: Disk quotas dquot_6.6.0 Mar 20 22:03:21.064379 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 20 22:03:21.064389 kernel: pnp: PnP ACPI init Mar 20 22:03:21.064500 kernel: pnp 00:03: [dma 2] Mar 20 22:03:21.064516 kernel: pnp: PnP ACPI: found 5 devices Mar 20 22:03:21.064526 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Mar 20 22:03:21.064540 kernel: NET: Registered PF_INET protocol family Mar 20 22:03:21.064549 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 20 22:03:21.064559 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 20 22:03:21.064569 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 20 22:03:21.064579 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 20 22:03:21.064589 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 20 22:03:21.064600 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 20 22:03:21.064609 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 20 22:03:21.064619 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 20 22:03:21.064631 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 20 22:03:21.064640 kernel: NET: Registered PF_XDP protocol family Mar 20 22:03:21.064754 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Mar 20 22:03:21.064845 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Mar 20 22:03:21.064932 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Mar 20 22:03:21.065020 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff 
window] Mar 20 22:03:21.065107 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window] Mar 20 22:03:21.065211 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release Mar 20 22:03:21.065319 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Mar 20 22:03:21.065334 kernel: PCI: CLS 0 bytes, default 64 Mar 20 22:03:21.065344 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 20 22:03:21.065354 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB) Mar 20 22:03:21.065364 kernel: Initialise system trusted keyrings Mar 20 22:03:21.065374 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 20 22:03:21.065384 kernel: Key type asymmetric registered Mar 20 22:03:21.065394 kernel: Asymmetric key parser 'x509' registered Mar 20 22:03:21.065403 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 20 22:03:21.065417 kernel: io scheduler mq-deadline registered Mar 20 22:03:21.065427 kernel: io scheduler kyber registered Mar 20 22:03:21.065436 kernel: io scheduler bfq registered Mar 20 22:03:21.065446 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 20 22:03:21.065457 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10 Mar 20 22:03:21.065467 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11 Mar 20 22:03:21.065476 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 Mar 20 22:03:21.065486 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10 Mar 20 22:03:21.065496 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 20 22:03:21.065508 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 20 22:03:21.065518 kernel: random: crng init done Mar 20 22:03:21.065528 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 20 22:03:21.065537 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 20 22:03:21.065547 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 20 22:03:21.065704 kernel: rtc_cmos 00:04: RTC can 
wake from S4 Mar 20 22:03:21.065723 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 20 22:03:21.065818 kernel: rtc_cmos 00:04: registered as rtc0 Mar 20 22:03:21.065916 kernel: rtc_cmos 00:04: setting system clock to 2025-03-20T22:03:20 UTC (1742508200) Mar 20 22:03:21.066010 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram Mar 20 22:03:21.066024 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Mar 20 22:03:21.066035 kernel: NET: Registered PF_INET6 protocol family Mar 20 22:03:21.066045 kernel: Segment Routing with IPv6 Mar 20 22:03:21.066054 kernel: In-situ OAM (IOAM) with IPv6 Mar 20 22:03:21.066064 kernel: NET: Registered PF_PACKET protocol family Mar 20 22:03:21.066074 kernel: Key type dns_resolver registered Mar 20 22:03:21.066083 kernel: IPI shorthand broadcast: enabled Mar 20 22:03:21.066097 kernel: sched_clock: Marking stable (1044008034, 171174430)->(1251527800, -36345336) Mar 20 22:03:21.066106 kernel: registered taskstats version 1 Mar 20 22:03:21.066116 kernel: Loading compiled-in X.509 certificates Mar 20 22:03:21.066126 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 9e7923b67df1c6f0613bc4380f7ea8de9ce851ac' Mar 20 22:03:21.066136 kernel: Key type .fscrypt registered Mar 20 22:03:21.066145 kernel: Key type fscrypt-provisioning registered Mar 20 22:03:21.066155 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 20 22:03:21.066165 kernel: ima: Allocated hash algorithm: sha1 Mar 20 22:03:21.066176 kernel: ima: No architecture policies found Mar 20 22:03:21.066186 kernel: clk: Disabling unused clocks Mar 20 22:03:21.066196 kernel: Freeing unused kernel image (initmem) memory: 43592K Mar 20 22:03:21.066205 kernel: Write protecting the kernel read-only data: 40960k Mar 20 22:03:21.066215 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K Mar 20 22:03:21.066225 kernel: Run /init as init process Mar 20 22:03:21.066234 kernel: with arguments: Mar 20 22:03:21.066244 kernel: /init Mar 20 22:03:21.066253 kernel: with environment: Mar 20 22:03:21.066263 kernel: HOME=/ Mar 20 22:03:21.066274 kernel: TERM=linux Mar 20 22:03:21.066284 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 20 22:03:21.066295 systemd[1]: Successfully made /usr/ read-only. Mar 20 22:03:21.066309 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 20 22:03:21.066320 systemd[1]: Detected virtualization kvm. Mar 20 22:03:21.066331 systemd[1]: Detected architecture x86-64. Mar 20 22:03:21.066341 systemd[1]: Running in initrd. Mar 20 22:03:21.066354 systemd[1]: No hostname configured, using default hostname. Mar 20 22:03:21.066364 systemd[1]: Hostname set to . Mar 20 22:03:21.066375 systemd[1]: Initializing machine ID from VM UUID. Mar 20 22:03:21.066385 systemd[1]: Queued start job for default target initrd.target. Mar 20 22:03:21.066396 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 20 22:03:21.066406 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Mar 20 22:03:21.066427 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 20 22:03:21.066439 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 20 22:03:21.066450 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 20 22:03:21.066464 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 20 22:03:21.066476 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 20 22:03:21.066486 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 20 22:03:21.066499 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 20 22:03:21.066510 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 20 22:03:21.066521 systemd[1]: Reached target paths.target - Path Units. Mar 20 22:03:21.066532 systemd[1]: Reached target slices.target - Slice Units. Mar 20 22:03:21.066542 systemd[1]: Reached target swap.target - Swaps. Mar 20 22:03:21.066553 systemd[1]: Reached target timers.target - Timer Units. Mar 20 22:03:21.066563 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 20 22:03:21.066574 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 20 22:03:21.066585 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 20 22:03:21.066598 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 20 22:03:21.066609 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 20 22:03:21.066619 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 20 22:03:21.066630 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Mar 20 22:03:21.066641 systemd[1]: Reached target sockets.target - Socket Units. Mar 20 22:03:21.066652 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 20 22:03:21.066663 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 20 22:03:21.066691 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 20 22:03:21.066705 systemd[1]: Starting systemd-fsck-usr.service... Mar 20 22:03:21.066715 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 20 22:03:21.066726 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 20 22:03:21.066737 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 20 22:03:21.066747 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 20 22:03:21.066799 systemd-journald[183]: Collecting audit messages is disabled. Mar 20 22:03:21.066831 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 20 22:03:21.066848 systemd[1]: Finished systemd-fsck-usr.service. Mar 20 22:03:21.066859 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 20 22:03:21.066871 systemd-journald[183]: Journal started Mar 20 22:03:21.066896 systemd-journald[183]: Runtime Journal (/run/log/journal/5f8fd856ef814d98b596fabf9c6602f9) is 8M, max 78.2M, 70.2M free. Mar 20 22:03:21.077689 systemd[1]: Started systemd-journald.service - Journal Service. Mar 20 22:03:21.081239 systemd-modules-load[184]: Inserted module 'overlay' Mar 20 22:03:21.125286 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 20 22:03:21.125309 kernel: Bridge firewalling registered Mar 20 22:03:21.111684 systemd-modules-load[184]: Inserted module 'br_netfilter' Mar 20 22:03:21.126233 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Mar 20 22:03:21.127917 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 22:03:21.129049 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 20 22:03:21.134143 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 20 22:03:21.136816 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 20 22:03:21.139788 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 20 22:03:21.147807 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 20 22:03:21.155889 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 20 22:03:21.160251 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 20 22:03:21.166387 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 22:03:21.168806 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 20 22:03:21.171601 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 20 22:03:21.192800 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 20 22:03:21.204072 dracut-cmdline[218]: dracut-dracut-053
Mar 20 22:03:21.207731 dracut-cmdline[218]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=619bfa043b53ac975036e415994a80721794ae8277072d0a93c174b4f7768019
Mar 20 22:03:21.237913 systemd-resolved[220]: Positive Trust Anchors:
Mar 20 22:03:21.238341 systemd-resolved[220]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 20 22:03:21.238384 systemd-resolved[220]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 20 22:03:21.245536 systemd-resolved[220]: Defaulting to hostname 'linux'.
Mar 20 22:03:21.246739 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 20 22:03:21.247884 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 20 22:03:21.299797 kernel: SCSI subsystem initialized
Mar 20 22:03:21.311732 kernel: Loading iSCSI transport class v2.0-870.
Mar 20 22:03:21.323721 kernel: iscsi: registered transport (tcp)
Mar 20 22:03:21.346778 kernel: iscsi: registered transport (qla4xxx)
Mar 20 22:03:21.346858 kernel: QLogic iSCSI HBA Driver
Mar 20 22:03:21.405720 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 20 22:03:21.410655 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 20 22:03:21.467897 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 20 22:03:21.468021 kernel: device-mapper: uevent: version 1.0.3
Mar 20 22:03:21.469984 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 20 22:03:21.530731 kernel: raid6: sse2x4 gen() 5201 MB/s
Mar 20 22:03:21.549720 kernel: raid6: sse2x2 gen() 5966 MB/s
Mar 20 22:03:21.568076 kernel: raid6: sse2x1 gen() 8837 MB/s
Mar 20 22:03:21.568147 kernel: raid6: using algorithm sse2x1 gen() 8837 MB/s
Mar 20 22:03:21.587201 kernel: raid6: .... xor() 7357 MB/s, rmw enabled
Mar 20 22:03:21.587262 kernel: raid6: using ssse3x2 recovery algorithm
Mar 20 22:03:21.610474 kernel: xor: measuring software checksum speed
Mar 20 22:03:21.610537 kernel: prefetch64-sse : 18491 MB/sec
Mar 20 22:03:21.610965 kernel: generic_sse : 16846 MB/sec
Mar 20 22:03:21.613105 kernel: xor: using function: prefetch64-sse (18491 MB/sec)
Mar 20 22:03:21.786733 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 20 22:03:21.803455 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 20 22:03:21.808550 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 20 22:03:21.835873 systemd-udevd[403]: Using default interface naming scheme 'v255'.
Mar 20 22:03:21.840938 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 20 22:03:21.848895 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 20 22:03:21.876103 dracut-pre-trigger[413]: rd.md=0: removing MD RAID activation
Mar 20 22:03:21.920855 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 20 22:03:21.925936 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 20 22:03:22.007229 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 20 22:03:22.012393 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 20 22:03:22.057051 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 20 22:03:22.060526 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 20 22:03:22.062640 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 20 22:03:22.066205 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 20 22:03:22.073643 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 20 22:03:22.091687 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues
Mar 20 22:03:22.142035 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB)
Mar 20 22:03:22.142167 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 20 22:03:22.142183 kernel: GPT:17805311 != 20971519
Mar 20 22:03:22.142195 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 20 22:03:22.142207 kernel: GPT:17805311 != 20971519
Mar 20 22:03:22.142218 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 20 22:03:22.142229 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 20 22:03:22.100955 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 20 22:03:22.143353 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 20 22:03:22.146786 kernel: libata version 3.00 loaded.
Mar 20 22:03:22.143490 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 22:03:22.146304 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 20 22:03:22.151205 kernel: ata_piix 0000:00:01.1: version 2.13
Mar 20 22:03:22.162969 kernel: scsi host0: ata_piix
Mar 20 22:03:22.163117 kernel: scsi host1: ata_piix
Mar 20 22:03:22.163231 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14
Mar 20 22:03:22.163245 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15
Mar 20 22:03:22.148710 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 20 22:03:22.148867 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 22:03:22.149913 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 22:03:22.151581 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 22:03:22.153834 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 20 22:03:22.189703 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (451)
Mar 20 22:03:22.192695 kernel: BTRFS: device fsid 48a514e8-9ecc-46c2-935b-caca347f921e devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (460)
Mar 20 22:03:22.224796 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 20 22:03:22.244719 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 20 22:03:22.245497 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 22:03:22.257508 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 20 22:03:22.266423 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 20 22:03:22.266998 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 20 22:03:22.269851 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 20 22:03:22.274777 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 20 22:03:22.293616 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 22:03:22.300517 disk-uuid[504]: Primary Header is updated.
Mar 20 22:03:22.300517 disk-uuid[504]: Secondary Entries is updated.
Mar 20 22:03:22.300517 disk-uuid[504]: Secondary Header is updated.
Mar 20 22:03:22.302137 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 20 22:03:23.321265 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 20 22:03:23.322970 disk-uuid[513]: The operation has completed successfully.
Mar 20 22:03:23.413469 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 20 22:03:23.413592 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 20 22:03:23.452197 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 20 22:03:23.472183 sh[524]: Success
Mar 20 22:03:23.494826 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3"
Mar 20 22:03:23.582044 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 20 22:03:23.589846 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 20 22:03:23.604618 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 20 22:03:23.631739 kernel: BTRFS info (device dm-0): first mount of filesystem 48a514e8-9ecc-46c2-935b-caca347f921e
Mar 20 22:03:23.631869 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 20 22:03:23.631908 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 20 22:03:23.634864 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 20 22:03:23.639464 kernel: BTRFS info (device dm-0): using free space tree
Mar 20 22:03:23.654423 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 20 22:03:23.655446 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 20 22:03:23.657782 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 20 22:03:23.659534 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 20 22:03:23.685975 kernel: BTRFS info (device vda6): first mount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e
Mar 20 22:03:23.686036 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 20 22:03:23.686049 kernel: BTRFS info (device vda6): using free space tree
Mar 20 22:03:23.693707 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 20 22:03:23.699719 kernel: BTRFS info (device vda6): last unmount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e
Mar 20 22:03:23.713790 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 20 22:03:23.715797 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 20 22:03:23.788225 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 20 22:03:23.791562 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 20 22:03:23.828176 systemd-networkd[704]: lo: Link UP
Mar 20 22:03:23.828184 systemd-networkd[704]: lo: Gained carrier
Mar 20 22:03:23.830588 systemd-networkd[704]: Enumeration completed
Mar 20 22:03:23.832041 systemd-networkd[704]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 22:03:23.832048 systemd-networkd[704]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 20 22:03:23.837154 systemd-networkd[704]: eth0: Link UP
Mar 20 22:03:23.837158 systemd-networkd[704]: eth0: Gained carrier
Mar 20 22:03:23.837166 systemd-networkd[704]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 22:03:23.840021 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 20 22:03:23.843507 systemd[1]: Reached target network.target - Network.
Mar 20 22:03:23.847711 systemd-networkd[704]: eth0: DHCPv4 address 172.24.4.166/24, gateway 172.24.4.1 acquired from 172.24.4.1
Mar 20 22:03:23.871812 ignition[630]: Ignition 2.20.0
Mar 20 22:03:23.872590 ignition[630]: Stage: fetch-offline
Mar 20 22:03:23.872634 ignition[630]: no configs at "/usr/lib/ignition/base.d"
Mar 20 22:03:23.872645 ignition[630]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 22:03:23.874543 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 20 22:03:23.872778 ignition[630]: parsed url from cmdline: ""
Mar 20 22:03:23.872782 ignition[630]: no config URL provided
Mar 20 22:03:23.872788 ignition[630]: reading system config file "/usr/lib/ignition/user.ign"
Mar 20 22:03:23.872797 ignition[630]: no config at "/usr/lib/ignition/user.ign"
Mar 20 22:03:23.872801 ignition[630]: failed to fetch config: resource requires networking
Mar 20 22:03:23.872984 ignition[630]: Ignition finished successfully
Mar 20 22:03:23.878774 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 20 22:03:23.898925 ignition[713]: Ignition 2.20.0
Mar 20 22:03:23.898937 ignition[713]: Stage: fetch
Mar 20 22:03:23.899108 ignition[713]: no configs at "/usr/lib/ignition/base.d"
Mar 20 22:03:23.899121 ignition[713]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 22:03:23.899204 ignition[713]: parsed url from cmdline: ""
Mar 20 22:03:23.899208 ignition[713]: no config URL provided
Mar 20 22:03:23.899213 ignition[713]: reading system config file "/usr/lib/ignition/user.ign"
Mar 20 22:03:23.899222 ignition[713]: no config at "/usr/lib/ignition/user.ign"
Mar 20 22:03:23.899336 ignition[713]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Mar 20 22:03:23.899423 ignition[713]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Mar 20 22:03:23.899453 ignition[713]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Mar 20 22:03:24.187093 systemd-resolved[220]: Detected conflict on linux IN A 172.24.4.166
Mar 20 22:03:24.187132 systemd-resolved[220]: Hostname conflict, changing published hostname from 'linux' to 'linux6'.
Mar 20 22:03:24.264302 ignition[713]: GET result: OK
Mar 20 22:03:24.265758 ignition[713]: parsing config with SHA512: 5786c05f45ef195550814bb7821b80ad9ba8451ce21464bbe9796771f90e93610f946ccb3f36a742e691e3ef44aaf9760a1d2032abe8f17fbf991acc8dd2a789
Mar 20 22:03:24.276977 unknown[713]: fetched base config from "system"
Mar 20 22:03:24.277003 unknown[713]: fetched base config from "system"
Mar 20 22:03:24.277910 ignition[713]: fetch: fetch complete
Mar 20 22:03:24.277022 unknown[713]: fetched user config from "openstack"
Mar 20 22:03:24.277923 ignition[713]: fetch: fetch passed
Mar 20 22:03:24.281579 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 20 22:03:24.278012 ignition[713]: Ignition finished successfully
Mar 20 22:03:24.286916 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 20 22:03:24.335634 ignition[719]: Ignition 2.20.0
Mar 20 22:03:24.335662 ignition[719]: Stage: kargs
Mar 20 22:03:24.336120 ignition[719]: no configs at "/usr/lib/ignition/base.d"
Mar 20 22:03:24.340573 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 20 22:03:24.336147 ignition[719]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 22:03:24.338448 ignition[719]: kargs: kargs passed
Mar 20 22:03:24.338547 ignition[719]: Ignition finished successfully
Mar 20 22:03:24.345981 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 20 22:03:24.381063 ignition[725]: Ignition 2.20.0
Mar 20 22:03:24.381095 ignition[725]: Stage: disks
Mar 20 22:03:24.381595 ignition[725]: no configs at "/usr/lib/ignition/base.d"
Mar 20 22:03:24.381634 ignition[725]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 22:03:24.389457 ignition[725]: disks: disks passed
Mar 20 22:03:24.389562 ignition[725]: Ignition finished successfully
Mar 20 22:03:24.391522 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 20 22:03:24.394554 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 20 22:03:24.396509 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 20 22:03:24.399535 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 20 22:03:24.402448 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 20 22:03:24.404952 systemd[1]: Reached target basic.target - Basic System.
Mar 20 22:03:24.409715 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 20 22:03:24.460096 systemd-fsck[733]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Mar 20 22:03:24.473335 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 20 22:03:24.478857 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 20 22:03:24.654743 kernel: EXT4-fs (vda9): mounted filesystem 79cdbe74-6884-4c57-b04d-c9a431509f16 r/w with ordered data mode. Quota mode: none.
Mar 20 22:03:24.655938 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 20 22:03:24.657417 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 20 22:03:24.660570 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 20 22:03:24.662750 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 20 22:03:24.663437 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 20 22:03:24.668280 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Mar 20 22:03:24.669308 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 20 22:03:24.669341 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 20 22:03:24.678836 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 20 22:03:24.682791 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 20 22:03:24.695690 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (741)
Mar 20 22:03:24.698684 kernel: BTRFS info (device vda6): first mount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e
Mar 20 22:03:24.702019 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 20 22:03:24.702042 kernel: BTRFS info (device vda6): using free space tree
Mar 20 22:03:24.711700 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 20 22:03:24.721748 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 20 22:03:24.821060 initrd-setup-root[771]: cut: /sysroot/etc/passwd: No such file or directory
Mar 20 22:03:24.827998 initrd-setup-root[778]: cut: /sysroot/etc/group: No such file or directory
Mar 20 22:03:24.835031 initrd-setup-root[785]: cut: /sysroot/etc/shadow: No such file or directory
Mar 20 22:03:24.841203 initrd-setup-root[792]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 20 22:03:24.942333 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 20 22:03:24.946931 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 20 22:03:24.952021 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 20 22:03:24.981593 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 20 22:03:24.987901 kernel: BTRFS info (device vda6): last unmount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e
Mar 20 22:03:25.014576 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 20 22:03:25.022800 systemd-networkd[704]: eth0: Gained IPv6LL
Mar 20 22:03:25.024066 ignition[861]: INFO : Ignition 2.20.0
Mar 20 22:03:25.025034 ignition[861]: INFO : Stage: mount
Mar 20 22:03:25.025655 ignition[861]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 20 22:03:25.025655 ignition[861]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 22:03:25.028359 ignition[861]: INFO : mount: mount passed
Mar 20 22:03:25.028359 ignition[861]: INFO : Ignition finished successfully
Mar 20 22:03:25.029819 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 20 22:03:31.873099 coreos-metadata[743]: Mar 20 22:03:31.873 WARN failed to locate config-drive, using the metadata service API instead
Mar 20 22:03:31.916950 coreos-metadata[743]: Mar 20 22:03:31.916 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 20 22:03:31.932863 coreos-metadata[743]: Mar 20 22:03:31.932 INFO Fetch successful
Mar 20 22:03:31.934382 coreos-metadata[743]: Mar 20 22:03:31.933 INFO wrote hostname ci-9999-0-2-f-52bc1ad8d1.novalocal to /sysroot/etc/hostname
Mar 20 22:03:31.936723 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Mar 20 22:03:31.936946 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Mar 20 22:03:31.944544 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 20 22:03:31.974493 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 20 22:03:32.006772 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (877)
Mar 20 22:03:32.014802 kernel: BTRFS info (device vda6): first mount of filesystem c415ef49-5595-4a0b-ba48-8f3e642f303e
Mar 20 22:03:32.014885 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 20 22:03:32.018980 kernel: BTRFS info (device vda6): using free space tree
Mar 20 22:03:32.029706 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 20 22:03:32.035408 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 20 22:03:32.082556 ignition[895]: INFO : Ignition 2.20.0
Mar 20 22:03:32.082556 ignition[895]: INFO : Stage: files
Mar 20 22:03:32.085621 ignition[895]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 20 22:03:32.085621 ignition[895]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 22:03:32.085621 ignition[895]: DEBUG : files: compiled without relabeling support, skipping
Mar 20 22:03:32.091127 ignition[895]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 20 22:03:32.091127 ignition[895]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 20 22:03:32.095193 ignition[895]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 20 22:03:32.095193 ignition[895]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 20 22:03:32.095193 ignition[895]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 20 22:03:32.093966 unknown[895]: wrote ssh authorized keys file for user: core
Mar 20 22:03:32.102645 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 20 22:03:32.102645 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Mar 20 22:03:33.897068 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 20 22:03:39.109478 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 20 22:03:39.111290 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 20 22:03:39.111290 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 20 22:03:39.111290 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 20 22:03:39.111290 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 20 22:03:39.111290 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 20 22:03:39.111290 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 20 22:03:39.111290 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 20 22:03:39.111290 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 20 22:03:39.111290 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 20 22:03:39.119287 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 20 22:03:39.119287 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 20 22:03:39.119287 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 20 22:03:39.119287 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 20 22:03:39.119287 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1
Mar 20 22:03:39.765587 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 20 22:03:41.299355 ignition[895]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 20 22:03:41.299355 ignition[895]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 20 22:03:41.323902 ignition[895]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 20 22:03:41.323902 ignition[895]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 20 22:03:41.323902 ignition[895]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 20 22:03:41.323902 ignition[895]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 20 22:03:41.337914 ignition[895]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 20 22:03:41.337914 ignition[895]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 20 22:03:41.337914 ignition[895]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 20 22:03:41.337914 ignition[895]: INFO : files: files passed
Mar 20 22:03:41.337914 ignition[895]: INFO : Ignition finished successfully
Mar 20 22:03:41.325991 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 20 22:03:41.333015 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 20 22:03:41.335824 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 20 22:03:41.362104 initrd-setup-root-after-ignition[923]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 20 22:03:41.362104 initrd-setup-root-after-ignition[923]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 20 22:03:41.368493 initrd-setup-root-after-ignition[928]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 20 22:03:41.367213 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 20 22:03:41.369218 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 20 22:03:41.372788 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 20 22:03:41.383338 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 20 22:03:41.383551 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 20 22:03:41.431355 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 20 22:03:41.431549 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 20 22:03:41.434144 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 20 22:03:41.435015 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 20 22:03:41.437820 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 20 22:03:41.440784 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 20 22:03:41.467338 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 20 22:03:41.471996 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 20 22:03:41.503805 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 20 22:03:41.505463 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 20 22:03:41.508586 systemd[1]: Stopped target timers.target - Timer Units.
Mar 20 22:03:41.511383 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 20 22:03:41.511700 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 20 22:03:41.514833 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 20 22:03:41.516848 systemd[1]: Stopped target basic.target - Basic System.
Mar 20 22:03:41.519596 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 20 22:03:41.522159 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 20 22:03:41.524539 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 20 22:03:41.527470 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 20 22:03:41.530372 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 20 22:03:41.533373 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 20 22:03:41.536191 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 20 22:03:41.539195 systemd[1]: Stopped target swap.target - Swaps.
Mar 20 22:03:41.541784 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 20 22:03:41.542078 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 20 22:03:41.545139 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 20 22:03:41.547057 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 20 22:03:41.549409 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 20 22:03:41.550130 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 20 22:03:41.552551 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 20 22:03:41.552878 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 20 22:03:41.556834 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 20 22:03:41.557148 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 20 22:03:41.558858 systemd[1]: ignition-files.service: Deactivated successfully. Mar 20 22:03:41.559128 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 20 22:03:41.565108 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 20 22:03:41.570891 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 20 22:03:41.575501 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 20 22:03:41.575998 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 20 22:03:41.579645 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 20 22:03:41.579851 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 20 22:03:41.589085 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 20 22:03:41.589174 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 20 22:03:41.600144 ignition[949]: INFO : Ignition 2.20.0 Mar 20 22:03:41.600144 ignition[949]: INFO : Stage: umount Mar 20 22:03:41.600144 ignition[949]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 20 22:03:41.600144 ignition[949]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 20 22:03:41.602855 ignition[949]: INFO : umount: umount passed Mar 20 22:03:41.602855 ignition[949]: INFO : Ignition finished successfully Mar 20 22:03:41.601126 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 20 22:03:41.602723 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 20 22:03:41.604542 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 20 22:03:41.604614 systemd[1]: Stopped ignition-disks.service - Ignition (disks). 
Mar 20 22:03:41.606264 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 20 22:03:41.606307 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 20 22:03:41.607217 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 20 22:03:41.607260 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 20 22:03:41.608780 systemd[1]: Stopped target network.target - Network. Mar 20 22:03:41.609991 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 20 22:03:41.610040 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 20 22:03:41.612490 systemd[1]: Stopped target paths.target - Path Units. Mar 20 22:03:41.613406 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 20 22:03:41.615786 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 20 22:03:41.616467 systemd[1]: Stopped target slices.target - Slice Units. Mar 20 22:03:41.617398 systemd[1]: Stopped target sockets.target - Socket Units. Mar 20 22:03:41.618506 systemd[1]: iscsid.socket: Deactivated successfully. Mar 20 22:03:41.618542 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 20 22:03:41.619993 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 20 22:03:41.620024 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 20 22:03:41.622482 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 20 22:03:41.622524 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 20 22:03:41.623434 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 20 22:03:41.623475 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 20 22:03:41.624493 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 20 22:03:41.626765 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... 
Mar 20 22:03:41.629431 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 20 22:03:41.631700 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 20 22:03:41.633527 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 20 22:03:41.638051 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 20 22:03:41.638310 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 20 22:03:41.638399 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 20 22:03:41.640460 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 20 22:03:41.640653 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 20 22:03:41.640781 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 20 22:03:41.643081 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 20 22:03:41.643140 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 20 22:03:41.643783 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 20 22:03:41.643855 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 20 22:03:41.645743 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 20 22:03:41.650326 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 20 22:03:41.650383 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 20 22:03:41.651600 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 20 22:03:41.651642 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 20 22:03:41.652648 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 20 22:03:41.652710 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 20 22:03:41.653643 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 20 22:03:41.653705 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 20 22:03:41.655121 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 20 22:03:41.657467 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 20 22:03:41.657525 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 20 22:03:41.666031 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 20 22:03:41.666185 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 20 22:03:41.667073 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 20 22:03:41.667111 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 20 22:03:41.668020 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 20 22:03:41.668052 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 20 22:03:41.669181 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 20 22:03:41.669223 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 20 22:03:41.670811 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 20 22:03:41.670852 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 20 22:03:41.671944 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 20 22:03:41.671984 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 22:03:41.675786 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 20 22:03:41.676852 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 20 22:03:41.676901 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 20 22:03:41.678119 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 20 22:03:41.678162 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 20 22:03:41.679830 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 20 22:03:41.679872 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 20 22:03:41.680998 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 20 22:03:41.681038 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 22:03:41.683526 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 20 22:03:41.683583 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 20 22:03:41.686817 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 20 22:03:41.686899 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 20 22:03:41.690427 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 20 22:03:41.690527 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 20 22:03:41.692269 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 20 22:03:41.693924 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 20 22:03:41.710963 systemd[1]: Switching root.
Mar 20 22:03:41.740759 systemd-journald[183]: Journal stopped
Mar 20 22:03:43.559155 systemd-journald[183]: Received SIGTERM from PID 1 (systemd).
Mar 20 22:03:43.559210 kernel: SELinux: policy capability network_peer_controls=1
Mar 20 22:03:43.559228 kernel: SELinux: policy capability open_perms=1
Mar 20 22:03:43.559239 kernel: SELinux: policy capability extended_socket_class=1
Mar 20 22:03:43.559251 kernel: SELinux: policy capability always_check_network=0
Mar 20 22:03:43.559265 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 20 22:03:43.559277 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 20 22:03:43.559290 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 20 22:03:43.559301 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 20 22:03:43.559313 kernel: audit: type=1403 audit(1742508222.467:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 20 22:03:43.559330 systemd[1]: Successfully loaded SELinux policy in 80.787ms.
Mar 20 22:03:43.559351 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 28.176ms.
Mar 20 22:03:43.559365 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 20 22:03:43.559377 systemd[1]: Detected virtualization kvm.
Mar 20 22:03:43.559392 systemd[1]: Detected architecture x86-64.
Mar 20 22:03:43.559404 systemd[1]: Detected first boot.
Mar 20 22:03:43.559416 systemd[1]: Hostname set to .
Mar 20 22:03:43.559428 systemd[1]: Initializing machine ID from VM UUID.
Mar 20 22:03:43.559440 zram_generator::config[996]: No configuration found.
Mar 20 22:03:43.559453 kernel: Guest personality initialized and is inactive
Mar 20 22:03:43.559464 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Mar 20 22:03:43.559477 kernel: Initialized host personality
Mar 20 22:03:43.559488 kernel: NET: Registered PF_VSOCK protocol family
Mar 20 22:03:43.559499 systemd[1]: Populated /etc with preset unit settings.
Mar 20 22:03:43.559512 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 20 22:03:43.559524 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 20 22:03:43.559536 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 20 22:03:43.560496 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 20 22:03:43.560514 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 20 22:03:43.560527 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 20 22:03:43.560544 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 20 22:03:43.560556 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 20 22:03:43.560569 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 20 22:03:43.560581 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 20 22:03:43.560593 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 20 22:03:43.560605 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 20 22:03:43.560618 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 20 22:03:43.560630 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 20 22:03:43.560642 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 20 22:03:43.560657 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 20 22:03:43.560689 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 20 22:03:43.560703 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 20 22:03:43.560715 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 20 22:03:43.560727 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 20 22:03:43.560739 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 20 22:03:43.560757 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 20 22:03:43.560769 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 20 22:03:43.560781 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 20 22:03:43.560793 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 20 22:03:43.560806 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 20 22:03:43.560818 systemd[1]: Reached target slices.target - Slice Units.
Mar 20 22:03:43.560830 systemd[1]: Reached target swap.target - Swaps.
Mar 20 22:03:43.560842 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 20 22:03:43.560853 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 20 22:03:43.560867 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 20 22:03:43.560883 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 20 22:03:43.560898 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 20 22:03:43.560909 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 20 22:03:43.560921 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 20 22:03:43.560933 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 20 22:03:43.560945 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 20 22:03:43.560958 systemd[1]: Mounting media.mount - External Media Directory...
Mar 20 22:03:43.560971 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 22:03:43.560984 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 20 22:03:43.560997 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 20 22:03:43.561009 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 20 22:03:43.561023 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 20 22:03:43.561035 systemd[1]: Reached target machines.target - Containers.
Mar 20 22:03:43.561047 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 20 22:03:43.561060 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 20 22:03:43.561072 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 20 22:03:43.561084 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 20 22:03:43.561098 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 20 22:03:43.561111 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 20 22:03:43.561123 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 20 22:03:43.561135 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 20 22:03:43.561147 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 20 22:03:43.561159 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 20 22:03:43.561171 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 20 22:03:43.561183 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 20 22:03:43.561197 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 20 22:03:43.561208 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 20 22:03:43.561221 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 20 22:03:43.561233 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 20 22:03:43.561244 kernel: fuse: init (API version 7.39)
Mar 20 22:03:43.561256 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 20 22:03:43.561268 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 20 22:03:43.561281 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 20 22:03:43.561295 kernel: loop: module loaded
Mar 20 22:03:43.561306 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 20 22:03:43.561318 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 20 22:03:43.561330 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 20 22:03:43.561342 systemd[1]: Stopped verity-setup.service.
Mar 20 22:03:43.561357 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 22:03:43.561371 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 20 22:03:43.561383 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 20 22:03:43.561395 systemd[1]: Mounted media.mount - External Media Directory.
Mar 20 22:03:43.561407 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 20 22:03:43.561419 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 20 22:03:43.561456 systemd-journald[1100]: Collecting audit messages is disabled.
Mar 20 22:03:43.561484 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 20 22:03:43.561497 systemd-journald[1100]: Journal started
Mar 20 22:03:43.561522 systemd-journald[1100]: Runtime Journal (/run/log/journal/5f8fd856ef814d98b596fabf9c6602f9) is 8M, max 78.2M, 70.2M free.
Mar 20 22:03:43.205261 systemd[1]: Queued start job for default target multi-user.target.
Mar 20 22:03:43.214143 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 20 22:03:43.214653 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 20 22:03:43.567694 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 20 22:03:43.566729 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 20 22:03:43.568014 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 20 22:03:43.568956 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 20 22:03:43.569106 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 20 22:03:43.572233 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 20 22:03:43.572393 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 20 22:03:43.573139 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 20 22:03:43.573295 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 20 22:03:43.574068 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 20 22:03:43.574220 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 20 22:03:43.574922 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 20 22:03:43.575065 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 20 22:03:43.577245 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 20 22:03:43.578046 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 20 22:03:43.580065 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 20 22:03:43.595511 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 20 22:03:43.600297 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 20 22:03:43.619661 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 20 22:03:43.621631 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 20 22:03:43.621685 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 20 22:03:43.623352 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 20 22:03:43.628801 kernel: ACPI: bus type drm_connector registered
Mar 20 22:03:43.629309 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 20 22:03:43.634038 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 20 22:03:43.635724 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 20 22:03:43.642284 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 20 22:03:43.644473 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 20 22:03:43.645074 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 20 22:03:43.650832 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 20 22:03:43.656279 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 20 22:03:43.662068 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 20 22:03:43.665204 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 20 22:03:43.669836 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 20 22:03:43.673621 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 20 22:03:43.674862 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 20 22:03:43.675698 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 20 22:03:43.676461 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 20 22:03:43.677923 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 20 22:03:43.684549 systemd-journald[1100]: Time spent on flushing to /var/log/journal/5f8fd856ef814d98b596fabf9c6602f9 is 52.681ms for 961 entries.
Mar 20 22:03:43.684549 systemd-journald[1100]: System Journal (/var/log/journal/5f8fd856ef814d98b596fabf9c6602f9) is 8M, max 584.8M, 576.8M free.
Mar 20 22:03:43.764631 systemd-journald[1100]: Received client request to flush runtime journal.
Mar 20 22:03:43.764703 kernel: loop0: detected capacity change from 0 to 205544
Mar 20 22:03:43.688849 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 20 22:03:43.692388 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 20 22:03:43.702871 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 20 22:03:43.712588 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 20 22:03:43.713933 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 20 22:03:43.717851 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 20 22:03:43.761605 udevadm[1142]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 20 22:03:43.768524 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 20 22:03:43.770763 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 20 22:03:43.789393 systemd-tmpfiles[1134]: ACLs are not supported, ignoring.
Mar 20 22:03:43.789410 systemd-tmpfiles[1134]: ACLs are not supported, ignoring.
Mar 20 22:03:43.797906 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 20 22:03:43.801530 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 20 22:03:43.817566 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 20 22:03:43.831935 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 20 22:03:43.865883 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 20 22:03:43.870044 kernel: loop1: detected capacity change from 0 to 109808
Mar 20 22:03:43.870869 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 20 22:03:43.907316 systemd-tmpfiles[1158]: ACLs are not supported, ignoring.
Mar 20 22:03:43.907337 systemd-tmpfiles[1158]: ACLs are not supported, ignoring.
Mar 20 22:03:43.920131 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 20 22:03:43.925092 kernel: loop2: detected capacity change from 0 to 151640 Mar 20 22:03:44.013743 kernel: loop3: detected capacity change from 0 to 8 Mar 20 22:03:44.051710 kernel: loop4: detected capacity change from 0 to 205544 Mar 20 22:03:44.122718 kernel: loop5: detected capacity change from 0 to 109808 Mar 20 22:03:44.177718 kernel: loop6: detected capacity change from 0 to 151640 Mar 20 22:03:44.217705 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 20 22:03:44.234699 kernel: loop7: detected capacity change from 0 to 8 Mar 20 22:03:44.234810 (sd-merge)[1165]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Mar 20 22:03:44.235518 (sd-merge)[1165]: Merged extensions into '/usr'. Mar 20 22:03:44.246640 systemd[1]: Reload requested from client PID 1133 ('systemd-sysext') (unit systemd-sysext.service)... Mar 20 22:03:44.246654 systemd[1]: Reloading... Mar 20 22:03:44.374871 zram_generator::config[1194]: No configuration found. Mar 20 22:03:44.572545 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 20 22:03:44.631720 ldconfig[1128]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 20 22:03:44.654111 systemd[1]: Reloading finished in 407 ms. Mar 20 22:03:44.673837 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 20 22:03:44.674798 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 20 22:03:44.676320 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 20 22:03:44.683917 systemd[1]: Starting ensure-sysext.service... Mar 20 22:03:44.687804 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Mar 20 22:03:44.690482 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 20 22:03:44.717279 systemd[1]: Reload requested from client PID 1251 ('systemctl') (unit ensure-sysext.service)... Mar 20 22:03:44.717296 systemd[1]: Reloading... Mar 20 22:03:44.738503 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 20 22:03:44.739611 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 20 22:03:44.747017 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 20 22:03:44.748826 systemd-tmpfiles[1252]: ACLs are not supported, ignoring. Mar 20 22:03:44.748975 systemd-tmpfiles[1252]: ACLs are not supported, ignoring. Mar 20 22:03:44.761217 systemd-tmpfiles[1252]: Detected autofs mount point /boot during canonicalization of boot. Mar 20 22:03:44.761229 systemd-tmpfiles[1252]: Skipping /boot Mar 20 22:03:44.765807 systemd-udevd[1253]: Using default interface naming scheme 'v255'. Mar 20 22:03:44.777843 systemd-tmpfiles[1252]: Detected autofs mount point /boot during canonicalization of boot. Mar 20 22:03:44.777855 systemd-tmpfiles[1252]: Skipping /boot Mar 20 22:03:44.805749 zram_generator::config[1283]: No configuration found. 
Mar 20 22:03:44.941729 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1303) Mar 20 22:03:45.002898 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 Mar 20 22:03:45.005710 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 Mar 20 22:03:45.014277 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Mar 20 22:03:45.007449 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 20 22:03:45.016992 kernel: ACPI: button: Power Button [PWRF] Mar 20 22:03:45.100705 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0 Mar 20 22:03:45.102711 kernel: mousedev: PS/2 mouse device common for all mice Mar 20 22:03:45.106700 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console Mar 20 22:03:45.115916 kernel: Console: switching to colour dummy device 80x25 Mar 20 22:03:45.115977 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 20 22:03:45.115999 kernel: [drm] features: -context_init Mar 20 22:03:45.119695 kernel: [drm] number of scanouts: 1 Mar 20 22:03:45.119738 kernel: [drm] number of cap sets: 0 Mar 20 22:03:45.124714 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0 Mar 20 22:03:45.134413 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Mar 20 22:03:45.134499 kernel: Console: switching to colour frame buffer device 160x50 Mar 20 22:03:45.137185 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 20 22:03:45.137800 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 20 22:03:45.139869 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 20 22:03:45.140386 systemd[1]: Reloading finished in 422 ms. 
Mar 20 22:03:45.152445 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 20 22:03:45.164323 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 20 22:03:45.192501 systemd[1]: Finished ensure-sysext.service.
Mar 20 22:03:45.207496 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 20 22:03:45.225976 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 22:03:45.228136 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 20 22:03:45.239893 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 20 22:03:45.240310 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 20 22:03:45.242034 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 20 22:03:45.248982 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 20 22:03:45.258509 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 20 22:03:45.262309 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 20 22:03:45.267097 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 20 22:03:45.269109 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 20 22:03:45.273055 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 20 22:03:45.273212 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 20 22:03:45.280180 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 20 22:03:45.286054 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 20 22:03:45.291851 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 20 22:03:45.296954 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 20 22:03:45.301778 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 20 22:03:45.307694 lvm[1375]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 20 22:03:45.306791 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 22:03:45.308868 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 22:03:45.313574 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 20 22:03:45.313766 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 20 22:03:45.314109 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 20 22:03:45.314386 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 20 22:03:45.315274 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 20 22:03:45.315411 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 20 22:03:45.319649 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 20 22:03:45.319748 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 20 22:03:45.321978 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 20 22:03:45.326306 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 20 22:03:45.326497 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 20 22:03:45.341467 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 20 22:03:45.356313 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 20 22:03:45.364959 augenrules[1415]: No rules
Mar 20 22:03:45.367967 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 20 22:03:45.368174 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 20 22:03:45.382092 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 20 22:03:45.385726 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 20 22:03:45.389277 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 20 22:03:45.395900 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 20 22:03:45.401738 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 20 22:03:45.417517 lvm[1424]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 20 22:03:45.421788 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 20 22:03:45.429796 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 20 22:03:45.463549 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 20 22:03:45.488721 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 22:03:45.500779 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 20 22:03:45.504322 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 20 22:03:45.537623 systemd-networkd[1394]: lo: Link UP
Mar 20 22:03:45.537631 systemd-networkd[1394]: lo: Gained carrier
Mar 20 22:03:45.543032 systemd-networkd[1394]: Enumeration completed
Mar 20 22:03:45.543228 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 20 22:03:45.545787 systemd-networkd[1394]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 22:03:45.545792 systemd-networkd[1394]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 20 22:03:45.546376 systemd-networkd[1394]: eth0: Link UP
Mar 20 22:03:45.546381 systemd-networkd[1394]: eth0: Gained carrier
Mar 20 22:03:45.546397 systemd-networkd[1394]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 22:03:45.549471 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 20 22:03:45.555003 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 20 22:03:45.560084 systemd-networkd[1394]: eth0: DHCPv4 address 172.24.4.166/24, gateway 172.24.4.1 acquired from 172.24.4.1
Mar 20 22:03:45.569819 systemd-resolved[1395]: Positive Trust Anchors:
Mar 20 22:03:45.572456 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 20 22:03:45.573317 systemd[1]: Reached target time-set.target - System Time Set.
Mar 20 22:03:45.573495 systemd-resolved[1395]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 20 22:03:45.573553 systemd-resolved[1395]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 20 22:03:45.580752 systemd-resolved[1395]: Using system hostname 'ci-9999-0-2-f-52bc1ad8d1.novalocal'.
Mar 20 22:03:45.582274 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 20 22:03:45.583486 systemd[1]: Reached target network.target - Network.
Mar 20 22:03:45.585514 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 20 22:03:45.586543 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 20 22:03:45.588014 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 20 22:03:45.589391 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 20 22:03:45.590922 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 20 22:03:45.592183 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 20 22:03:45.593368 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 20 22:03:45.594485 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 20 22:03:45.594584 systemd[1]: Reached target paths.target - Path Units.
Mar 20 22:03:45.595747 systemd[1]: Reached target timers.target - Timer Units.
Mar 20 22:03:45.599042 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 20 22:03:45.601863 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 20 22:03:45.606420 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 20 22:03:45.608362 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 20 22:03:45.609786 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 20 22:03:45.624333 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 20 22:03:45.625996 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 20 22:03:45.628595 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 20 22:03:45.630427 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 20 22:03:45.632398 systemd[1]: Reached target sockets.target - Socket Units.
Mar 20 22:03:45.634302 systemd[1]: Reached target basic.target - Basic System.
Mar 20 22:03:45.635905 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 20 22:03:45.636019 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 20 22:03:45.638763 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 20 22:03:45.643631 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 20 22:03:45.649836 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 20 22:03:45.653882 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 20 22:03:45.661847 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 20 22:03:45.662585 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 20 22:03:45.668877 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 20 22:03:45.680019 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 20 22:03:45.686390 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 20 22:03:45.692731 jq[1449]: false
Mar 20 22:03:45.693925 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 20 22:03:45.702856 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 20 22:03:45.704346 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 20 22:03:45.707590 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 20 22:03:45.708163 dbus-daemon[1448]: [system] SELinux support is enabled
Mar 20 22:03:45.709923 systemd[1]: Starting update-engine.service - Update Engine...
Mar 20 22:03:46.322800 systemd-timesyncd[1396]: Contacted time server 74.208.14.149:123 (0.flatcar.pool.ntp.org).
Mar 20 22:03:46.322865 systemd-timesyncd[1396]: Initial clock synchronization to Thu 2025-03-20 22:03:46.322687 UTC.
Mar 20 22:03:46.322947 systemd-resolved[1395]: Clock change detected. Flushing caches.
Mar 20 22:03:46.327052 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 20 22:03:46.330523 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 20 22:03:46.334809 extend-filesystems[1450]: Found loop4
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found loop5
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found loop6
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found loop7
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found vda
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found vda1
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found vda2
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found vda3
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found usr
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found vda4
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found vda6
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found vda7
Mar 20 22:03:46.346683 extend-filesystems[1450]: Found vda9
Mar 20 22:03:46.346683 extend-filesystems[1450]: Checking size of /dev/vda9
Mar 20 22:03:46.486650 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks
Mar 20 22:03:46.486682 kernel: EXT4-fs (vda9): resized filesystem to 2014203
Mar 20 22:03:46.486698 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1314)
Mar 20 22:03:46.486762 extend-filesystems[1450]: Resized partition /dev/vda9
Mar 20 22:03:46.364367 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 20 22:03:46.487496 extend-filesystems[1476]: resize2fs 1.47.2 (1-Jan-2025)
Mar 20 22:03:46.487496 extend-filesystems[1476]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Mar 20 22:03:46.487496 extend-filesystems[1476]: old_desc_blocks = 1, new_desc_blocks = 1
Mar 20 22:03:46.487496 extend-filesystems[1476]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long.
Mar 20 22:03:46.364911 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 20 22:03:46.501550 jq[1462]: true
Mar 20 22:03:46.505852 extend-filesystems[1450]: Resized filesystem in /dev/vda9
Mar 20 22:03:46.365164 systemd[1]: motdgen.service: Deactivated successfully.
Mar 20 22:03:46.509691 update_engine[1461]: I20250320 22:03:46.397806 1461 main.cc:92] Flatcar Update Engine starting
Mar 20 22:03:46.509691 update_engine[1461]: I20250320 22:03:46.411552 1461 update_check_scheduler.cc:74] Next update check in 11m52s
Mar 20 22:03:46.365319 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 20 22:03:46.380756 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 20 22:03:46.510143 jq[1477]: true
Mar 20 22:03:46.381270 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 20 22:03:46.403961 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 20 22:03:46.403993 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 20 22:03:46.430787 systemd-logind[1458]: New seat seat0.
Mar 20 22:03:46.434362 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 20 22:03:46.434385 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 20 22:03:46.458597 (ntainerd)[1483]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 20 22:03:46.459332 systemd[1]: Started update-engine.service - Update Engine.
Mar 20 22:03:46.476483 systemd-logind[1458]: Watching system buttons on /dev/input/event1 (Power Button)
Mar 20 22:03:46.476501 systemd-logind[1458]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 20 22:03:46.526430 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 20 22:03:46.528729 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 20 22:03:46.529587 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 20 22:03:46.533507 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 20 22:03:46.555645 tar[1475]: linux-amd64/helm
Mar 20 22:03:46.637652 bash[1506]: Updated "/home/core/.ssh/authorized_keys"
Mar 20 22:03:46.638684 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 20 22:03:46.645039 systemd[1]: Starting sshkeys.service...
Mar 20 22:03:46.678874 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 20 22:03:46.683712 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 20 22:03:46.775612 sshd_keygen[1472]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 20 22:03:46.776776 locksmithd[1486]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 20 22:03:46.803343 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 20 22:03:46.810921 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 20 22:03:46.844168 systemd[1]: issuegen.service: Deactivated successfully.
Mar 20 22:03:46.844681 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 20 22:03:46.854015 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 20 22:03:46.873699 containerd[1483]: time="2025-03-20T22:03:46Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 20 22:03:46.874650 containerd[1483]: time="2025-03-20T22:03:46.874305535Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
Mar 20 22:03:46.878840 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 20 22:03:46.886209 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 20 22:03:46.891306 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 20 22:03:46.892127 systemd[1]: Reached target getty.target - Login Prompts.
Mar 20 22:03:46.899714 containerd[1483]: time="2025-03-20T22:03:46.899675626Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.434µs"
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.899785793Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.899812012Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.899963536Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.899981700Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.900009212Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.900067862Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.900081588Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.900298615Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.900316448Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.900328170Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.900337277Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 20 22:03:46.900646 containerd[1483]: time="2025-03-20T22:03:46.900413049Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 20 22:03:46.900993 containerd[1483]: time="2025-03-20T22:03:46.900595071Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 20 22:03:46.900993 containerd[1483]: time="2025-03-20T22:03:46.900639664Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 20 22:03:46.900993 containerd[1483]: time="2025-03-20T22:03:46.900653540Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 20 22:03:46.901694 containerd[1483]: time="2025-03-20T22:03:46.901666059Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 20 22:03:46.902037 containerd[1483]: time="2025-03-20T22:03:46.902008582Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 20 22:03:46.902101 containerd[1483]: time="2025-03-20T22:03:46.902075658Z" level=info msg="metadata content store policy set" policy=shared
Mar 20 22:03:46.912651 containerd[1483]: time="2025-03-20T22:03:46.912603563Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 20 22:03:46.912735 containerd[1483]: time="2025-03-20T22:03:46.912674315Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 20 22:03:46.912735 containerd[1483]: time="2025-03-20T22:03:46.912693281Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 20 22:03:46.912735 containerd[1483]: time="2025-03-20T22:03:46.912720091Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 20 22:03:46.912822 containerd[1483]: time="2025-03-20T22:03:46.912736362Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 20 22:03:46.912822 containerd[1483]: time="2025-03-20T22:03:46.912749897Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 20 22:03:46.912822 containerd[1483]: time="2025-03-20T22:03:46.912766949Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 20 22:03:46.912822 containerd[1483]: time="2025-03-20T22:03:46.912785985Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 20 22:03:46.912822 containerd[1483]: time="2025-03-20T22:03:46.912803317Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 20 22:03:46.912822 containerd[1483]: time="2025-03-20T22:03:46.912818386Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 20 22:03:46.913045 containerd[1483]: time="2025-03-20T22:03:46.912832031Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 20 22:03:46.913045 containerd[1483]: time="2025-03-20T22:03:46.912854543Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 20 22:03:46.913045 containerd[1483]: time="2025-03-20T22:03:46.912997231Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 20 22:03:46.913045 containerd[1483]: time="2025-03-20T22:03:46.913021577Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 20 22:03:46.913045 containerd[1483]: time="2025-03-20T22:03:46.913035533Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 20 22:03:46.913148 containerd[1483]: time="2025-03-20T22:03:46.913048788Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 20 22:03:46.913148 containerd[1483]: time="2025-03-20T22:03:46.913061582Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 20 22:03:46.913148 containerd[1483]: time="2025-03-20T22:03:46.913073985Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 20 22:03:46.913148 containerd[1483]: time="2025-03-20T22:03:46.913086338Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 20 22:03:46.913148 containerd[1483]: time="2025-03-20T22:03:46.913098471Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 20 22:03:46.913148 containerd[1483]: time="2025-03-20T22:03:46.913112307Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 20 22:03:46.913148 containerd[1483]: time="2025-03-20T22:03:46.913125792Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 20 22:03:46.913148 containerd[1483]: time="2025-03-20T22:03:46.913138025Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 20 22:03:46.913416 containerd[1483]: time="2025-03-20T22:03:46.913199721Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 20 22:03:46.913416 containerd[1483]: time="2025-03-20T22:03:46.913216152Z" level=info msg="Start snapshots syncer"
Mar 20 22:03:46.913416 containerd[1483]: time="2025-03-20T22:03:46.913246078Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 20 22:03:46.913547 containerd[1483]: time="2025-03-20T22:03:46.913503300Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 20 22:03:46.913691 containerd[1483]: time="2025-03-20T22:03:46.913567040Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 20 22:03:46.913691 containerd[1483]: time="2025-03-20T22:03:46.913658000Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 20 22:03:46.913809 containerd[1483]: time="2025-03-20T22:03:46.913764550Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 20 22:03:46.913809 containerd[1483]: time="2025-03-20T22:03:46.913789527Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 20 22:03:46.913809 containerd[1483]: time="2025-03-20T22:03:46.913801910Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 20 22:03:46.913872 containerd[1483]: time="2025-03-20T22:03:46.913814894Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 20 22:03:46.913872 containerd[1483]: time="2025-03-20T22:03:46.913829462Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 20 22:03:46.913872 containerd[1483]: time="2025-03-20T22:03:46.913841444Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 20 22:03:46.913872 containerd[1483]: time="2025-03-20T22:03:46.913855731Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 20 22:03:46.913953 containerd[1483]: time="2025-03-20T22:03:46.913884816Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 20 22:03:46.913953 containerd[1483]: time="2025-03-20T22:03:46.913920052Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 20 22:03:46.913953 containerd[1483]: time="2025-03-20T22:03:46.913932385Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.913965046Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.913980064Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.913989672Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.914000292Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.914009730Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.914019618Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.914030699Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.914047140Z" level=info msg="runtime interface created"
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.914052941Z" level=info msg="created NRI interface"
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.914063160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.914075433Z" level=info msg="Connect containerd service"
Mar 20 22:03:46.914098 containerd[1483]: time="2025-03-20T22:03:46.914100450Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 20 22:03:46.916863 containerd[1483]: time="2025-03-20T22:03:46.916839257Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 20 22:03:47.076606 containerd[1483]: time="2025-03-20T22:03:47.076264819Z" level=info msg="Start subscribing containerd event"
Mar 20 22:03:47.076606 containerd[1483]: time="2025-03-20T22:03:47.076334069Z" level=info msg="Start recovering state"
Mar 20 22:03:47.076606 containerd[1483]: time="2025-03-20T22:03:47.076479592Z" level=info msg="Start event monitor"
Mar 20 22:03:47.076606 containerd[1483]: time="2025-03-20T22:03:47.076483720Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 20 22:03:47.076606 containerd[1483]: time="2025-03-20T22:03:47.076571284Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 20 22:03:47.076606 containerd[1483]: time="2025-03-20T22:03:47.076503607Z" level=info msg="Start cni network conf syncer for default"
Mar 20 22:03:47.076606 containerd[1483]: time="2025-03-20T22:03:47.076599747Z" level=info msg="Start streaming server"
Mar 20 22:03:47.076920 containerd[1483]: time="2025-03-20T22:03:47.076613363Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 20 22:03:47.076920 containerd[1483]: time="2025-03-20T22:03:47.076647998Z" level=info msg="runtime interface starting up..."
Mar 20 22:03:47.076920 containerd[1483]: time="2025-03-20T22:03:47.076656895Z" level=info msg="starting plugins..."
Mar 20 22:03:47.076920 containerd[1483]: time="2025-03-20T22:03:47.076677864Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 20 22:03:47.076920 containerd[1483]: time="2025-03-20T22:03:47.076797749Z" level=info msg="containerd successfully booted in 0.204544s"
Mar 20 22:03:47.077131 systemd[1]: Started containerd.service - containerd container runtime.
Mar 20 22:03:47.205873 tar[1475]: linux-amd64/LICENSE
Mar 20 22:03:47.206022 tar[1475]: linux-amd64/README.md
Mar 20 22:03:47.228329 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 20 22:03:47.966967 systemd-networkd[1394]: eth0: Gained IPv6LL
Mar 20 22:03:47.970438 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 20 22:03:47.975561 systemd[1]: Reached target network-online.target - Network is Online.
Mar 20 22:03:47.984125 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:03:47.992284 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 20 22:03:48.062394 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 20 22:03:48.407206 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 20 22:03:48.413755 systemd[1]: Started sshd@0-172.24.4.166:22-172.24.4.1:47072.service - OpenSSH per-connection server daemon (172.24.4.1:47072).
Mar 20 22:03:49.413954 sshd[1569]: Accepted publickey for core from 172.24.4.1 port 47072 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:03:49.417295 sshd-session[1569]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:03:49.453648 systemd-logind[1458]: New session 1 of user core.
Mar 20 22:03:49.458373 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 20 22:03:49.462584 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 20 22:03:49.503553 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 20 22:03:49.514278 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 20 22:03:49.529689 (systemd)[1574]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 20 22:03:49.533011 systemd-logind[1458]: New session c1 of user core.
Mar 20 22:03:49.694962 systemd[1574]: Queued start job for default target default.target.
Mar 20 22:03:49.701546 systemd[1574]: Created slice app.slice - User Application Slice.
Mar 20 22:03:49.701568 systemd[1574]: Reached target paths.target - Paths.
Mar 20 22:03:49.701606 systemd[1574]: Reached target timers.target - Timers.
Mar 20 22:03:49.702974 systemd[1574]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 20 22:03:49.714931 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:03:49.720829 (kubelet)[1585]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 22:03:49.723784 systemd[1574]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 20 22:03:49.725030 systemd[1574]: Reached target sockets.target - Sockets.
Mar 20 22:03:49.725082 systemd[1574]: Reached target basic.target - Basic System.
Mar 20 22:03:49.725117 systemd[1574]: Reached target default.target - Main User Target.
Mar 20 22:03:49.725142 systemd[1574]: Startup finished in 186ms.
Mar 20 22:03:49.725273 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 20 22:03:49.735855 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 20 22:03:50.046791 systemd[1]: Started sshd@1-172.24.4.166:22-172.24.4.1:47086.service - OpenSSH per-connection server daemon (172.24.4.1:47086).
Mar 20 22:03:51.163660 kubelet[1585]: E0320 22:03:51.163596 1585 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 22:03:51.167369 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 22:03:51.167526 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 22:03:51.168049 systemd[1]: kubelet.service: Consumed 1.890s CPU time, 239M memory peak.
Mar 20 22:03:51.767961 sshd[1596]: Accepted publickey for core from 172.24.4.1 port 47086 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:03:51.770671 sshd-session[1596]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:03:51.781272 systemd-logind[1458]: New session 2 of user core.
Mar 20 22:03:51.792042 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 20 22:03:52.132396 login[1536]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 20 22:03:52.141070 login[1537]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 20 22:03:52.145210 systemd-logind[1458]: New session 4 of user core.
Mar 20 22:03:52.159103 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 20 22:03:52.166503 systemd-logind[1458]: New session 3 of user core.
Mar 20 22:03:52.174276 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 20 22:03:52.467969 sshd[1600]: Connection closed by 172.24.4.1 port 47086
Mar 20 22:03:52.470403 sshd-session[1596]: pam_unix(sshd:session): session closed for user core
Mar 20 22:03:52.487059 systemd[1]: sshd@1-172.24.4.166:22-172.24.4.1:47086.service: Deactivated successfully.
Mar 20 22:03:52.490783 systemd[1]: session-2.scope: Deactivated successfully.
Mar 20 22:03:52.493172 systemd-logind[1458]: Session 2 logged out. Waiting for processes to exit.
Mar 20 22:03:52.497530 systemd[1]: Started sshd@2-172.24.4.166:22-172.24.4.1:47098.service - OpenSSH per-connection server daemon (172.24.4.1:47098).
Mar 20 22:03:52.500924 systemd-logind[1458]: Removed session 2.
Mar 20 22:03:53.331217 coreos-metadata[1447]: Mar 20 22:03:53.331 WARN failed to locate config-drive, using the metadata service API instead
Mar 20 22:03:53.381129 coreos-metadata[1447]: Mar 20 22:03:53.381 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1
Mar 20 22:03:53.574416 coreos-metadata[1447]: Mar 20 22:03:53.574 INFO Fetch successful
Mar 20 22:03:53.574416 coreos-metadata[1447]: Mar 20 22:03:53.574 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 20 22:03:53.591080 coreos-metadata[1447]: Mar 20 22:03:53.590 INFO Fetch successful
Mar 20 22:03:53.591080 coreos-metadata[1447]: Mar 20 22:03:53.590 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1
Mar 20 22:03:53.606015 coreos-metadata[1447]: Mar 20 22:03:53.605 INFO Fetch successful
Mar 20 22:03:53.606015 coreos-metadata[1447]: Mar 20 22:03:53.605 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1
Mar 20 22:03:53.619915 coreos-metadata[1447]: Mar 20 22:03:53.619 INFO Fetch successful
Mar 20 22:03:53.619915 coreos-metadata[1447]: Mar 20 22:03:53.619 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1
Mar 20 22:03:53.636772 coreos-metadata[1447]: Mar 20 22:03:53.636 INFO Fetch successful
Mar 20 22:03:53.636772 coreos-metadata[1447]: Mar 20 22:03:53.636 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1
Mar 20 22:03:53.651294 coreos-metadata[1447]: Mar 20 22:03:53.651 INFO Fetch successful
Mar 20 22:03:53.700975 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 20 22:03:53.702931 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 20 22:03:53.768401 coreos-metadata[1513]: Mar 20 22:03:53.768 WARN failed to locate config-drive, using the metadata service API instead
Mar 20 22:03:53.811109 coreos-metadata[1513]: Mar 20 22:03:53.811 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1
Mar 20 22:03:53.827708 coreos-metadata[1513]: Mar 20 22:03:53.827 INFO Fetch successful
Mar 20 22:03:53.827708 coreos-metadata[1513]: Mar 20 22:03:53.827 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1
Mar 20 22:03:53.842173 coreos-metadata[1513]: Mar 20 22:03:53.842 INFO Fetch successful
Mar 20 22:03:53.848990 unknown[1513]: wrote ssh authorized keys file for user: core
Mar 20 22:03:53.889688 update-ssh-keys[1643]: Updated "/home/core/.ssh/authorized_keys"
Mar 20 22:03:53.892301 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 20 22:03:53.894713 systemd[1]: Finished sshkeys.service.
Mar 20 22:03:53.900411 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 20 22:03:53.900861 systemd[1]: Startup finished in 1.261s (kernel) + 21.646s (initrd) + 10.904s (userspace) = 33.812s.
Mar 20 22:03:53.985939 sshd[1631]: Accepted publickey for core from 172.24.4.1 port 47098 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:03:53.988671 sshd-session[1631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:03:54.001547 systemd-logind[1458]: New session 5 of user core.
Mar 20 22:03:54.011965 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 20 22:03:54.673674 sshd[1647]: Connection closed by 172.24.4.1 port 47098
Mar 20 22:03:54.672426 sshd-session[1631]: pam_unix(sshd:session): session closed for user core
Mar 20 22:03:54.678959 systemd[1]: sshd@2-172.24.4.166:22-172.24.4.1:47098.service: Deactivated successfully.
Mar 20 22:03:54.683049 systemd[1]: session-5.scope: Deactivated successfully.
Mar 20 22:03:54.686462 systemd-logind[1458]: Session 5 logged out. Waiting for processes to exit.
Mar 20 22:03:54.689059 systemd-logind[1458]: Removed session 5.
Mar 20 22:04:01.419263 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 20 22:04:01.422491 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:04:01.799266 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:04:01.810125 (kubelet)[1660]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 22:04:01.972170 kubelet[1660]: E0320 22:04:01.972011 1660 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 22:04:01.978178 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 22:04:01.978489 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 22:04:01.979100 systemd[1]: kubelet.service: Consumed 290ms CPU time, 95.9M memory peak.
Mar 20 22:04:04.691666 systemd[1]: Started sshd@3-172.24.4.166:22-172.24.4.1:34732.service - OpenSSH per-connection server daemon (172.24.4.1:34732).
Mar 20 22:04:05.931674 sshd[1668]: Accepted publickey for core from 172.24.4.1 port 34732 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:04:05.933886 sshd-session[1668]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:04:05.943068 systemd-logind[1458]: New session 6 of user core.
Mar 20 22:04:05.953046 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 20 22:04:06.594508 sshd[1670]: Connection closed by 172.24.4.1 port 34732
Mar 20 22:04:06.594968 sshd-session[1668]: pam_unix(sshd:session): session closed for user core
Mar 20 22:04:06.611671 systemd[1]: sshd@3-172.24.4.166:22-172.24.4.1:34732.service: Deactivated successfully.
Mar 20 22:04:06.615202 systemd[1]: session-6.scope: Deactivated successfully.
Mar 20 22:04:06.619011 systemd-logind[1458]: Session 6 logged out. Waiting for processes to exit.
Mar 20 22:04:06.622059 systemd[1]: Started sshd@4-172.24.4.166:22-172.24.4.1:34736.service - OpenSSH per-connection server daemon (172.24.4.1:34736).
Mar 20 22:04:06.625671 systemd-logind[1458]: Removed session 6.
Mar 20 22:04:07.943474 sshd[1675]: Accepted publickey for core from 172.24.4.1 port 34736 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:04:07.946106 sshd-session[1675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:04:07.957150 systemd-logind[1458]: New session 7 of user core.
Mar 20 22:04:07.966904 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 20 22:04:08.651090 sshd[1678]: Connection closed by 172.24.4.1 port 34736
Mar 20 22:04:08.652155 sshd-session[1675]: pam_unix(sshd:session): session closed for user core
Mar 20 22:04:08.667911 systemd[1]: sshd@4-172.24.4.166:22-172.24.4.1:34736.service: Deactivated successfully.
Mar 20 22:04:08.670942 systemd[1]: session-7.scope: Deactivated successfully.
Mar 20 22:04:08.672354 systemd-logind[1458]: Session 7 logged out. Waiting for processes to exit.
Mar 20 22:04:08.676894 systemd[1]: Started sshd@5-172.24.4.166:22-172.24.4.1:34746.service - OpenSSH per-connection server daemon (172.24.4.1:34746).
Mar 20 22:04:08.680245 systemd-logind[1458]: Removed session 7.
Mar 20 22:04:10.018013 sshd[1683]: Accepted publickey for core from 172.24.4.1 port 34746 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:04:10.020873 sshd-session[1683]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:04:10.034510 systemd-logind[1458]: New session 8 of user core.
Mar 20 22:04:10.045915 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 20 22:04:10.642516 sshd[1686]: Connection closed by 172.24.4.1 port 34746
Mar 20 22:04:10.643279 sshd-session[1683]: pam_unix(sshd:session): session closed for user core
Mar 20 22:04:10.659239 systemd[1]: sshd@5-172.24.4.166:22-172.24.4.1:34746.service: Deactivated successfully.
Mar 20 22:04:10.662340 systemd[1]: session-8.scope: Deactivated successfully.
Mar 20 22:04:10.665249 systemd-logind[1458]: Session 8 logged out. Waiting for processes to exit.
Mar 20 22:04:10.669062 systemd[1]: Started sshd@6-172.24.4.166:22-172.24.4.1:34752.service - OpenSSH per-connection server daemon (172.24.4.1:34752).
Mar 20 22:04:10.672000 systemd-logind[1458]: Removed session 8.
Mar 20 22:04:11.999614 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 20 22:04:12.002742 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:04:12.127752 sshd[1691]: Accepted publickey for core from 172.24.4.1 port 34752 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:04:12.130140 sshd-session[1691]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:04:12.145214 systemd-logind[1458]: New session 9 of user core.
Mar 20 22:04:12.151754 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 20 22:04:12.347491 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:04:12.362614 (kubelet)[1703]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 22:04:12.491302 kubelet[1703]: E0320 22:04:12.491188 1703 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 22:04:12.494527 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 22:04:12.494854 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 22:04:12.495571 systemd[1]: kubelet.service: Consumed 293ms CPU time, 96.5M memory peak.
Mar 20 22:04:12.613847 sudo[1710]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 20 22:04:12.614478 sudo[1710]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 22:04:12.634180 sudo[1710]: pam_unix(sudo:session): session closed for user root
Mar 20 22:04:12.948345 sshd[1697]: Connection closed by 172.24.4.1 port 34752
Mar 20 22:04:12.950547 sshd-session[1691]: pam_unix(sshd:session): session closed for user core
Mar 20 22:04:12.963316 systemd[1]: sshd@6-172.24.4.166:22-172.24.4.1:34752.service: Deactivated successfully.
Mar 20 22:04:12.966508 systemd[1]: session-9.scope: Deactivated successfully.
Mar 20 22:04:12.969486 systemd-logind[1458]: Session 9 logged out. Waiting for processes to exit.
Mar 20 22:04:12.973245 systemd[1]: Started sshd@7-172.24.4.166:22-172.24.4.1:34758.service - OpenSSH per-connection server daemon (172.24.4.1:34758).
Mar 20 22:04:12.975547 systemd-logind[1458]: Removed session 9.
Mar 20 22:04:14.178290 sshd[1715]: Accepted publickey for core from 172.24.4.1 port 34758 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:04:14.180959 sshd-session[1715]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:04:14.192115 systemd-logind[1458]: New session 10 of user core.
Mar 20 22:04:14.203125 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 20 22:04:14.617022 sudo[1720]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 20 22:04:14.617574 sudo[1720]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 22:04:14.625887 sudo[1720]: pam_unix(sudo:session): session closed for user root
Mar 20 22:04:14.637398 sudo[1719]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 20 22:04:14.638153 sudo[1719]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 22:04:14.659731 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 20 22:04:14.734149 augenrules[1742]: No rules
Mar 20 22:04:14.735181 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 20 22:04:14.735541 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 20 22:04:14.737919 sudo[1719]: pam_unix(sudo:session): session closed for user root
Mar 20 22:04:14.933173 sshd[1718]: Connection closed by 172.24.4.1 port 34758
Mar 20 22:04:14.930747 sshd-session[1715]: pam_unix(sshd:session): session closed for user core
Mar 20 22:04:14.947013 systemd[1]: sshd@7-172.24.4.166:22-172.24.4.1:34758.service: Deactivated successfully.
Mar 20 22:04:14.950088 systemd[1]: session-10.scope: Deactivated successfully.
Mar 20 22:04:14.951509 systemd-logind[1458]: Session 10 logged out. Waiting for processes to exit.
Mar 20 22:04:14.955394 systemd[1]: Started sshd@8-172.24.4.166:22-172.24.4.1:39198.service - OpenSSH per-connection server daemon (172.24.4.1:39198).
Mar 20 22:04:14.958506 systemd-logind[1458]: Removed session 10.
Mar 20 22:04:16.303604 sshd[1750]: Accepted publickey for core from 172.24.4.1 port 39198 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:04:16.306184 sshd-session[1750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:04:16.316768 systemd-logind[1458]: New session 11 of user core.
Mar 20 22:04:16.325993 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 20 22:04:16.741244 sudo[1754]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 20 22:04:16.741910 sudo[1754]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 22:04:17.595382 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 20 22:04:17.607912 (dockerd)[1772]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 20 22:04:18.042388 dockerd[1772]: time="2025-03-20T22:04:18.042303477Z" level=info msg="Starting up"
Mar 20 22:04:18.047858 dockerd[1772]: time="2025-03-20T22:04:18.047820063Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 20 22:04:18.089782 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport254951457-merged.mount: Deactivated successfully.
Mar 20 22:04:18.128210 dockerd[1772]: time="2025-03-20T22:04:18.128117321Z" level=info msg="Loading containers: start."
Mar 20 22:04:18.380713 kernel: Initializing XFRM netlink socket
Mar 20 22:04:18.498744 systemd-networkd[1394]: docker0: Link UP
Mar 20 22:04:18.558010 dockerd[1772]: time="2025-03-20T22:04:18.557948810Z" level=info msg="Loading containers: done."
Mar 20 22:04:18.580689 dockerd[1772]: time="2025-03-20T22:04:18.580552483Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 20 22:04:18.580821 dockerd[1772]: time="2025-03-20T22:04:18.580772726Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
Mar 20 22:04:18.581023 dockerd[1772]: time="2025-03-20T22:04:18.580972811Z" level=info msg="Daemon has completed initialization"
Mar 20 22:04:18.649059 dockerd[1772]: time="2025-03-20T22:04:18.648553589Z" level=info msg="API listen on /run/docker.sock"
Mar 20 22:04:18.649327 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 20 22:04:20.124027 containerd[1483]: time="2025-03-20T22:04:20.123495336Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\""
Mar 20 22:04:20.856409 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3334755754.mount: Deactivated successfully.
Mar 20 22:04:22.498558 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 20 22:04:22.503056 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:04:22.672512 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:04:22.681899 (kubelet)[2031]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 22:04:22.919788 containerd[1483]: time="2025-03-20T22:04:22.918420220Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:22.923037 containerd[1483]: time="2025-03-20T22:04:22.922266470Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.7: active requests=0, bytes read=27959276"
Mar 20 22:04:22.929017 containerd[1483]: time="2025-03-20T22:04:22.928792560Z" level=info msg="ImageCreate event name:\"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:22.937949 containerd[1483]: time="2025-03-20T22:04:22.937884167Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:22.944307 containerd[1483]: time="2025-03-20T22:04:22.944242072Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.7\" with image id \"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\", size \"27956068\" in 2.820653791s"
Mar 20 22:04:22.944573 containerd[1483]: time="2025-03-20T22:04:22.944508371Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\" returns image reference \"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\""
Mar 20 22:04:22.950702 containerd[1483]: time="2025-03-20T22:04:22.950575731Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\""
Mar 20 22:04:22.965694 kubelet[2031]: E0320 22:04:22.965095 2031 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 22:04:22.969531 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 22:04:22.970597 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 22:04:22.971714 systemd[1]: kubelet.service: Consumed 187ms CPU time, 98.3M memory peak.
Mar 20 22:04:24.861434 containerd[1483]: time="2025-03-20T22:04:24.861365252Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:24.862645 containerd[1483]: time="2025-03-20T22:04:24.862553052Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.7: active requests=0, bytes read=24713784"
Mar 20 22:04:24.863946 containerd[1483]: time="2025-03-20T22:04:24.863903166Z" level=info msg="ImageCreate event name:\"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:24.867899 containerd[1483]: time="2025-03-20T22:04:24.867842801Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:24.870112 containerd[1483]: time="2025-03-20T22:04:24.870086973Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.7\" with image id \"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\", size \"26201384\" in 1.918921576s"
Mar 20 22:04:24.870967 containerd[1483]: time="2025-03-20T22:04:24.870184376Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\" returns image reference \"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\""
Mar 20 22:04:24.872851 containerd[1483]: time="2025-03-20T22:04:24.872830913Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\""
Mar 20 22:04:26.740609 containerd[1483]: time="2025-03-20T22:04:26.740538821Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:26.742045 containerd[1483]: time="2025-03-20T22:04:26.741756086Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.7: active requests=0, bytes read=18780376"
Mar 20 22:04:26.743494 containerd[1483]: time="2025-03-20T22:04:26.743435027Z" level=info msg="ImageCreate event name:\"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:26.746584 containerd[1483]: time="2025-03-20T22:04:26.746539283Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:26.747750 containerd[1483]: time="2025-03-20T22:04:26.747575818Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.7\" with image id \"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\", size \"20267994\" in 1.87465705s"
Mar 20 22:04:26.747750 containerd[1483]: time="2025-03-20T22:04:26.747615403Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\" returns image reference \"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\""
Mar 20 22:04:26.748168 containerd[1483]: time="2025-03-20T22:04:26.748142842Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\""
Mar 20 22:04:28.162374 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3604138990.mount: Deactivated successfully.
Mar 20 22:04:28.697421 containerd[1483]: time="2025-03-20T22:04:28.697362775Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:28.698683 containerd[1483]: time="2025-03-20T22:04:28.698610376Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.7: active requests=0, bytes read=30354638"
Mar 20 22:04:28.699668 containerd[1483]: time="2025-03-20T22:04:28.699586418Z" level=info msg="ImageCreate event name:\"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:28.701728 containerd[1483]: time="2025-03-20T22:04:28.701675558Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:28.702416 containerd[1483]: time="2025-03-20T22:04:28.702254725Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.7\" with image id \"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\", repo tag \"registry.k8s.io/kube-proxy:v1.31.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\", size \"30353649\" in 1.954081216s"
Mar 20 22:04:28.702416 containerd[1483]: time="2025-03-20T22:04:28.702297957Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\" returns image reference \"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\""
Mar 20 22:04:28.703005 containerd[1483]: time="2025-03-20T22:04:28.702983644Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Mar 20 22:04:29.354585 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2082218532.mount: Deactivated successfully.
Mar 20 22:04:30.763808 containerd[1483]: time="2025-03-20T22:04:30.763676838Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:30.766446 containerd[1483]: time="2025-03-20T22:04:30.766175978Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769"
Mar 20 22:04:30.768215 containerd[1483]: time="2025-03-20T22:04:30.768080282Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:30.774607 containerd[1483]: time="2025-03-20T22:04:30.774504466Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:30.778154 containerd[1483]: time="2025-03-20T22:04:30.777817142Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.074713223s"
Mar 20 22:04:30.778154 containerd[1483]: time="2025-03-20T22:04:30.777922631Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Mar 20 22:04:30.779478 containerd[1483]: time="2025-03-20T22:04:30.779052281Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 20 22:04:31.356076 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2875442806.mount: Deactivated successfully.
Mar 20 22:04:31.367650 containerd[1483]: time="2025-03-20T22:04:31.367531265Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 22:04:31.369437 containerd[1483]: time="2025-03-20T22:04:31.369322917Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146"
Mar 20 22:04:31.371156 containerd[1483]: time="2025-03-20T22:04:31.371028087Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 22:04:31.376156 containerd[1483]: time="2025-03-20T22:04:31.376020544Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 22:04:31.377992 containerd[1483]: time="2025-03-20T22:04:31.377760349Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 598.644249ms"
Mar 20 22:04:31.377992 containerd[1483]: time="2025-03-20T22:04:31.377827485Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Mar 20 22:04:31.380013 containerd[1483]: time="2025-03-20T22:04:31.379550669Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Mar 20 22:04:31.955338 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2718372126.mount: Deactivated successfully.
Mar 20 22:04:32.102856 update_engine[1461]: I20250320 22:04:32.102653 1461 update_attempter.cc:509] Updating boot flags...
Mar 20 22:04:32.147993 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2123)
Mar 20 22:04:32.230739 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2125)
Mar 20 22:04:32.999917 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 20 22:04:33.005006 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:04:33.639943 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:04:33.654590 (kubelet)[2178]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 22:04:33.731944 kubelet[2178]: E0320 22:04:33.731461 2178 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 22:04:33.733660 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 22:04:33.733804 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 22:04:33.734079 systemd[1]: kubelet.service: Consumed 217ms CPU time, 95.7M memory peak.
Mar 20 22:04:34.971293 containerd[1483]: time="2025-03-20T22:04:34.971176561Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:04:34.975946 containerd[1483]: time="2025-03-20T22:04:34.975808671Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56779981" Mar 20 22:04:34.979019 containerd[1483]: time="2025-03-20T22:04:34.978864024Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:04:35.062687 containerd[1483]: time="2025-03-20T22:04:35.061934124Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:04:35.066020 containerd[1483]: time="2025-03-20T22:04:35.065589601Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 3.685974601s" Mar 20 22:04:35.066020 containerd[1483]: time="2025-03-20T22:04:35.065729263Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Mar 20 22:04:39.262587 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 22:04:39.263096 systemd[1]: kubelet.service: Consumed 217ms CPU time, 95.7M memory peak. Mar 20 22:04:39.268141 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 22:04:39.311288 systemd[1]: Reload requested from client PID 2215 ('systemctl') (unit session-11.scope)... 
Mar 20 22:04:39.311426 systemd[1]: Reloading... Mar 20 22:04:39.437687 zram_generator::config[2261]: No configuration found. Mar 20 22:04:39.590826 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 20 22:04:39.712328 systemd[1]: Reloading finished in 400 ms. Mar 20 22:04:39.770274 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 20 22:04:39.770356 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 20 22:04:39.770712 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 22:04:39.770750 systemd[1]: kubelet.service: Consumed 163ms CPU time, 83.6M memory peak. Mar 20 22:04:39.772821 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 22:04:40.456966 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 22:04:40.480274 (kubelet)[2328]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 20 22:04:40.564649 kubelet[2328]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 22:04:40.564649 kubelet[2328]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 22:04:40.564649 kubelet[2328]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 22:04:40.565070 kubelet[2328]: I0320 22:04:40.564908 2328 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 22:04:41.241810 kubelet[2328]: I0320 22:04:41.241713 2328 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Mar 20 22:04:41.241810 kubelet[2328]: I0320 22:04:41.241794 2328 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 22:04:41.243646 kubelet[2328]: I0320 22:04:41.242195 2328 server.go:929] "Client rotation is on, will bootstrap in background" Mar 20 22:04:42.082577 kubelet[2328]: E0320 22:04:42.082241 2328 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.24.4.166:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.24.4.166:6443: connect: connection refused" logger="UnhandledError" Mar 20 22:04:42.082577 kubelet[2328]: I0320 22:04:42.082296 2328 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 20 22:04:42.101280 kubelet[2328]: I0320 22:04:42.100953 2328 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 22:04:42.111210 kubelet[2328]: I0320 22:04:42.111170 2328 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 20 22:04:42.114400 kubelet[2328]: I0320 22:04:42.114327 2328 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 22:04:42.114761 kubelet[2328]: I0320 22:04:42.114671 2328 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 22:04:42.115138 kubelet[2328]: I0320 22:04:42.114738 2328 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-0-2-f-52bc1ad8d1.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none
","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 22:04:42.115138 kubelet[2328]: I0320 22:04:42.115123 2328 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 22:04:42.115459 kubelet[2328]: I0320 22:04:42.115159 2328 container_manager_linux.go:300] "Creating device plugin manager" Mar 20 22:04:42.115459 kubelet[2328]: I0320 22:04:42.115358 2328 state_mem.go:36] "Initialized new in-memory state store" Mar 20 22:04:42.120383 kubelet[2328]: I0320 22:04:42.120013 2328 kubelet.go:408] "Attempting to sync node with API server" Mar 20 22:04:42.120383 kubelet[2328]: I0320 22:04:42.120061 2328 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 22:04:42.120383 kubelet[2328]: I0320 22:04:42.120109 2328 kubelet.go:314] "Adding apiserver pod source" Mar 20 22:04:42.120383 kubelet[2328]: I0320 22:04:42.120134 2328 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 22:04:42.135814 kubelet[2328]: W0320 22:04:42.134744 2328 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.166:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-2-f-52bc1ad8d1.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.166:6443: connect: connection refused Mar 20 22:04:42.135814 kubelet[2328]: E0320 22:04:42.134877 2328 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.24.4.166:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-2-f-52bc1ad8d1.novalocal&limit=500&resourceVersion=0\": dial tcp 172.24.4.166:6443: connect: connection refused" logger="UnhandledError" Mar 20 22:04:42.135814 kubelet[2328]: W0320 22:04:42.135603 2328 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.166:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial 
tcp 172.24.4.166:6443: connect: connection refused Mar 20 22:04:42.135814 kubelet[2328]: E0320 22:04:42.135732 2328 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.24.4.166:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.24.4.166:6443: connect: connection refused" logger="UnhandledError" Mar 20 22:04:42.136850 kubelet[2328]: I0320 22:04:42.136688 2328 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 20 22:04:42.141243 kubelet[2328]: I0320 22:04:42.141010 2328 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 22:04:42.143303 kubelet[2328]: W0320 22:04:42.143271 2328 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 20 22:04:42.145601 kubelet[2328]: I0320 22:04:42.145249 2328 server.go:1269] "Started kubelet" Mar 20 22:04:42.148690 kubelet[2328]: I0320 22:04:42.148107 2328 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 22:04:42.157145 kubelet[2328]: I0320 22:04:42.157034 2328 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 22:04:42.157694 kubelet[2328]: I0320 22:04:42.157592 2328 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 22:04:42.159020 kubelet[2328]: I0320 22:04:42.158965 2328 server.go:460] "Adding debug handlers to kubelet server" Mar 20 22:04:42.167342 kubelet[2328]: E0320 22:04:42.161684 2328 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.166:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.166:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-9999-0-2-f-52bc1ad8d1.novalocal.182ea20ce7707b86 default 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-9999-0-2-f-52bc1ad8d1.novalocal,UID:ci-9999-0-2-f-52bc1ad8d1.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-9999-0-2-f-52bc1ad8d1.novalocal,},FirstTimestamp:2025-03-20 22:04:42.145176454 +0000 UTC m=+1.657792353,LastTimestamp:2025-03-20 22:04:42.145176454 +0000 UTC m=+1.657792353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-0-2-f-52bc1ad8d1.novalocal,}" Mar 20 22:04:42.170695 kubelet[2328]: I0320 22:04:42.157620 2328 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 22:04:42.171012 kubelet[2328]: I0320 22:04:42.170976 2328 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 20 22:04:42.174142 kubelet[2328]: I0320 22:04:42.174102 2328 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 22:04:42.174717 kubelet[2328]: E0320 22:04:42.174676 2328 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-0-2-f-52bc1ad8d1.novalocal\" not found" Mar 20 22:04:42.177385 kubelet[2328]: E0320 22:04:42.177348 2328 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-2-f-52bc1ad8d1.novalocal?timeout=10s\": dial tcp 172.24.4.166:6443: connect: connection refused" interval="200ms" Mar 20 22:04:42.177787 kubelet[2328]: I0320 22:04:42.177774 2328 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 22:04:42.178679 kubelet[2328]: W0320 22:04:42.178220 2328 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed 
to list *v1.CSIDriver: Get "https://172.24.4.166:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.166:6443: connect: connection refused Mar 20 22:04:42.178679 kubelet[2328]: E0320 22:04:42.178273 2328 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.24.4.166:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.24.4.166:6443: connect: connection refused" logger="UnhandledError" Mar 20 22:04:42.178679 kubelet[2328]: I0320 22:04:42.178322 2328 reconciler.go:26] "Reconciler: start to sync state" Mar 20 22:04:42.180178 kubelet[2328]: I0320 22:04:42.180160 2328 factory.go:221] Registration of the systemd container factory successfully Mar 20 22:04:42.180335 kubelet[2328]: I0320 22:04:42.180315 2328 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 20 22:04:42.181960 kubelet[2328]: I0320 22:04:42.181944 2328 factory.go:221] Registration of the containerd container factory successfully Mar 20 22:04:42.198132 kubelet[2328]: I0320 22:04:42.198050 2328 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 22:04:42.199655 kubelet[2328]: I0320 22:04:42.199592 2328 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 22:04:42.199722 kubelet[2328]: I0320 22:04:42.199709 2328 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 22:04:42.199774 kubelet[2328]: I0320 22:04:42.199754 2328 kubelet.go:2321] "Starting kubelet main sync loop" Mar 20 22:04:42.199890 kubelet[2328]: E0320 22:04:42.199844 2328 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 22:04:42.211247 kubelet[2328]: W0320 22:04:42.211144 2328 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.166:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.166:6443: connect: connection refused Mar 20 22:04:42.211454 kubelet[2328]: E0320 22:04:42.211269 2328 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.24.4.166:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.24.4.166:6443: connect: connection refused" logger="UnhandledError" Mar 20 22:04:42.215260 kubelet[2328]: I0320 22:04:42.215128 2328 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 20 22:04:42.215260 kubelet[2328]: I0320 22:04:42.215143 2328 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 20 22:04:42.215260 kubelet[2328]: I0320 22:04:42.215158 2328 state_mem.go:36] "Initialized new in-memory state store" Mar 20 22:04:42.220038 kubelet[2328]: I0320 22:04:42.219943 2328 policy_none.go:49] "None policy: Start" Mar 20 22:04:42.220679 kubelet[2328]: I0320 22:04:42.220642 2328 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 22:04:42.220679 kubelet[2328]: I0320 22:04:42.220664 2328 state_mem.go:35] "Initializing new in-memory state store" Mar 20 22:04:42.230317 systemd[1]: Created slice kubepods.slice - 
libcontainer container kubepods.slice. Mar 20 22:04:42.245383 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 20 22:04:42.249487 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 20 22:04:42.255498 kubelet[2328]: I0320 22:04:42.255435 2328 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 22:04:42.255803 kubelet[2328]: I0320 22:04:42.255608 2328 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 22:04:42.255803 kubelet[2328]: I0320 22:04:42.255635 2328 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 22:04:42.256223 kubelet[2328]: I0320 22:04:42.256096 2328 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 22:04:42.259399 kubelet[2328]: E0320 22:04:42.259277 2328 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-9999-0-2-f-52bc1ad8d1.novalocal\" not found" Mar 20 22:04:42.314849 systemd[1]: Created slice kubepods-burstable-pod14049434aee82e13cb5f856d2f449e31.slice - libcontainer container kubepods-burstable-pod14049434aee82e13cb5f856d2f449e31.slice. Mar 20 22:04:42.331974 systemd[1]: Created slice kubepods-burstable-pod5821a8a65f5a893df1d692ff4b8f78da.slice - libcontainer container kubepods-burstable-pod5821a8a65f5a893df1d692ff4b8f78da.slice. Mar 20 22:04:42.346081 systemd[1]: Created slice kubepods-burstable-pod3ddb92c09a3fa308b1d8de8377de1088.slice - libcontainer container kubepods-burstable-pod3ddb92c09a3fa308b1d8de8377de1088.slice. 
Mar 20 22:04:42.358426 kubelet[2328]: I0320 22:04:42.358404 2328 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.359072 kubelet[2328]: E0320 22:04:42.359008 2328 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.166:6443/api/v1/nodes\": dial tcp 172.24.4.166:6443: connect: connection refused" node="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.378859 kubelet[2328]: E0320 22:04:42.378833 2328 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-2-f-52bc1ad8d1.novalocal?timeout=10s\": dial tcp 172.24.4.166:6443: connect: connection refused" interval="400ms" Mar 20 22:04:42.380808 kubelet[2328]: I0320 22:04:42.380119 2328 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/14049434aee82e13cb5f856d2f449e31-ca-certs\") pod \"kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"14049434aee82e13cb5f856d2f449e31\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.380808 kubelet[2328]: I0320 22:04:42.380186 2328 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/14049434aee82e13cb5f856d2f449e31-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"14049434aee82e13cb5f856d2f449e31\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.380808 kubelet[2328]: I0320 22:04:42.380239 2328 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5821a8a65f5a893df1d692ff4b8f78da-kubeconfig\") pod 
\"kube-scheduler-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"5821a8a65f5a893df1d692ff4b8f78da\") " pod="kube-system/kube-scheduler-ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.380808 kubelet[2328]: I0320 22:04:42.380312 2328 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3ddb92c09a3fa308b1d8de8377de1088-k8s-certs\") pod \"kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"3ddb92c09a3fa308b1d8de8377de1088\") " pod="kube-system/kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.381154 kubelet[2328]: I0320 22:04:42.380362 2328 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3ddb92c09a3fa308b1d8de8377de1088-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"3ddb92c09a3fa308b1d8de8377de1088\") " pod="kube-system/kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.381154 kubelet[2328]: I0320 22:04:42.380410 2328 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/14049434aee82e13cb5f856d2f449e31-k8s-certs\") pod \"kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"14049434aee82e13cb5f856d2f449e31\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.381154 kubelet[2328]: I0320 22:04:42.380453 2328 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/14049434aee82e13cb5f856d2f449e31-kubeconfig\") pod \"kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"14049434aee82e13cb5f856d2f449e31\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.381154 kubelet[2328]: 
I0320 22:04:42.380497 2328 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/14049434aee82e13cb5f856d2f449e31-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"14049434aee82e13cb5f856d2f449e31\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.381394 kubelet[2328]: I0320 22:04:42.380573 2328 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3ddb92c09a3fa308b1d8de8377de1088-ca-certs\") pod \"kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"3ddb92c09a3fa308b1d8de8377de1088\") " pod="kube-system/kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.562204 kubelet[2328]: I0320 22:04:42.562155 2328 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.562995 kubelet[2328]: E0320 22:04:42.562935 2328 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.166:6443/api/v1/nodes\": dial tcp 172.24.4.166:6443: connect: connection refused" node="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:04:42.632356 containerd[1483]: time="2025-03-20T22:04:42.631205592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal,Uid:14049434aee82e13cb5f856d2f449e31,Namespace:kube-system,Attempt:0,}" Mar 20 22:04:42.645688 containerd[1483]: time="2025-03-20T22:04:42.645510838Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-0-2-f-52bc1ad8d1.novalocal,Uid:5821a8a65f5a893df1d692ff4b8f78da,Namespace:kube-system,Attempt:0,}" Mar 20 22:04:42.653536 containerd[1483]: time="2025-03-20T22:04:42.653102280Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal,Uid:3ddb92c09a3fa308b1d8de8377de1088,Namespace:kube-system,Attempt:0,}" Mar 20 22:04:42.712686 containerd[1483]: time="2025-03-20T22:04:42.712523392Z" level=info msg="connecting to shim 8bf4f7b430cf1a576df372e71f754681320d49ad60dc8b951880142dd428b13c" address="unix:///run/containerd/s/9ffad3ebbb017f7c8dbcd833b20d48748ca4f25ee2527089e754a45800d53164" namespace=k8s.io protocol=ttrpc version=3 Mar 20 22:04:42.722283 containerd[1483]: time="2025-03-20T22:04:42.722172114Z" level=info msg="connecting to shim 9f54a6863b3813db790d85e24fd385576defa21e478c67a7ebf67e8987eda4c4" address="unix:///run/containerd/s/949c73e3e9920183b27778b0e2418862add13467e4c99897f4b586284421069e" namespace=k8s.io protocol=ttrpc version=3 Mar 20 22:04:42.761439 containerd[1483]: time="2025-03-20T22:04:42.760924962Z" level=info msg="connecting to shim 11028ca05b9e5453938c55e5b694a67cf3d2b0e54d6d4a850aefdfa80444b9cf" address="unix:///run/containerd/s/f704f4aa5de6f70cdb2e4252d9561edcea1eec7f34ddde782045337fe851c8b6" namespace=k8s.io protocol=ttrpc version=3 Mar 20 22:04:42.772839 systemd[1]: Started cri-containerd-9f54a6863b3813db790d85e24fd385576defa21e478c67a7ebf67e8987eda4c4.scope - libcontainer container 9f54a6863b3813db790d85e24fd385576defa21e478c67a7ebf67e8987eda4c4. Mar 20 22:04:42.778964 systemd[1]: Started cri-containerd-8bf4f7b430cf1a576df372e71f754681320d49ad60dc8b951880142dd428b13c.scope - libcontainer container 8bf4f7b430cf1a576df372e71f754681320d49ad60dc8b951880142dd428b13c. 
Mar 20 22:04:42.779880 kubelet[2328]: E0320 22:04:42.779439 2328 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.166:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-2-f-52bc1ad8d1.novalocal?timeout=10s\": dial tcp 172.24.4.166:6443: connect: connection refused" interval="800ms"
Mar 20 22:04:42.811796 systemd[1]: Started cri-containerd-11028ca05b9e5453938c55e5b694a67cf3d2b0e54d6d4a850aefdfa80444b9cf.scope - libcontainer container 11028ca05b9e5453938c55e5b694a67cf3d2b0e54d6d4a850aefdfa80444b9cf.
Mar 20 22:04:42.866868 containerd[1483]: time="2025-03-20T22:04:42.866679413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal,Uid:14049434aee82e13cb5f856d2f449e31,Namespace:kube-system,Attempt:0,} returns sandbox id \"9f54a6863b3813db790d85e24fd385576defa21e478c67a7ebf67e8987eda4c4\""
Mar 20 22:04:42.871121 containerd[1483]: time="2025-03-20T22:04:42.870318510Z" level=info msg="CreateContainer within sandbox \"9f54a6863b3813db790d85e24fd385576defa21e478c67a7ebf67e8987eda4c4\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 20 22:04:42.877148 containerd[1483]: time="2025-03-20T22:04:42.877101725Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-0-2-f-52bc1ad8d1.novalocal,Uid:5821a8a65f5a893df1d692ff4b8f78da,Namespace:kube-system,Attempt:0,} returns sandbox id \"8bf4f7b430cf1a576df372e71f754681320d49ad60dc8b951880142dd428b13c\""
Mar 20 22:04:42.885853 containerd[1483]: time="2025-03-20T22:04:42.885744338Z" level=info msg="CreateContainer within sandbox \"8bf4f7b430cf1a576df372e71f754681320d49ad60dc8b951880142dd428b13c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 20 22:04:42.894874 containerd[1483]: time="2025-03-20T22:04:42.894710598Z" level=info msg="Container 8a5fe6af41b3ec5ef43abfa0b2788c24f9969681dbeec094ec9d0831c8d8eceb: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:04:42.908418 containerd[1483]: time="2025-03-20T22:04:42.908107781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal,Uid:3ddb92c09a3fa308b1d8de8377de1088,Namespace:kube-system,Attempt:0,} returns sandbox id \"11028ca05b9e5453938c55e5b694a67cf3d2b0e54d6d4a850aefdfa80444b9cf\""
Mar 20 22:04:42.914471 containerd[1483]: time="2025-03-20T22:04:42.914346034Z" level=info msg="CreateContainer within sandbox \"11028ca05b9e5453938c55e5b694a67cf3d2b0e54d6d4a850aefdfa80444b9cf\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 20 22:04:42.917278 containerd[1483]: time="2025-03-20T22:04:42.917241355Z" level=info msg="Container eaf948cbb58c8027df0d99ed65763d36df430adb397f2adf6c08b2239f17cd9d: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:04:42.919763 containerd[1483]: time="2025-03-20T22:04:42.919728531Z" level=info msg="CreateContainer within sandbox \"9f54a6863b3813db790d85e24fd385576defa21e478c67a7ebf67e8987eda4c4\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8a5fe6af41b3ec5ef43abfa0b2788c24f9969681dbeec094ec9d0831c8d8eceb\""
Mar 20 22:04:42.920917 containerd[1483]: time="2025-03-20T22:04:42.920822012Z" level=info msg="StartContainer for \"8a5fe6af41b3ec5ef43abfa0b2788c24f9969681dbeec094ec9d0831c8d8eceb\""
Mar 20 22:04:42.923874 containerd[1483]: time="2025-03-20T22:04:42.923353721Z" level=info msg="connecting to shim 8a5fe6af41b3ec5ef43abfa0b2788c24f9969681dbeec094ec9d0831c8d8eceb" address="unix:///run/containerd/s/949c73e3e9920183b27778b0e2418862add13467e4c99897f4b586284421069e" protocol=ttrpc version=3
Mar 20 22:04:42.938898 containerd[1483]: time="2025-03-20T22:04:42.938861363Z" level=info msg="Container 1cbf8db31e91cc8f4af71c62fe547b318395ac8a2e3a023793d625cfb078fd9a: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:04:42.941274 containerd[1483]: time="2025-03-20T22:04:42.941232742Z" level=info msg="CreateContainer within sandbox \"8bf4f7b430cf1a576df372e71f754681320d49ad60dc8b951880142dd428b13c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"eaf948cbb58c8027df0d99ed65763d36df430adb397f2adf6c08b2239f17cd9d\""
Mar 20 22:04:42.941726 containerd[1483]: time="2025-03-20T22:04:42.941696392Z" level=info msg="StartContainer for \"eaf948cbb58c8027df0d99ed65763d36df430adb397f2adf6c08b2239f17cd9d\""
Mar 20 22:04:42.943400 containerd[1483]: time="2025-03-20T22:04:42.942749237Z" level=info msg="connecting to shim eaf948cbb58c8027df0d99ed65763d36df430adb397f2adf6c08b2239f17cd9d" address="unix:///run/containerd/s/9ffad3ebbb017f7c8dbcd833b20d48748ca4f25ee2527089e754a45800d53164" protocol=ttrpc version=3
Mar 20 22:04:42.946698 systemd[1]: Started cri-containerd-8a5fe6af41b3ec5ef43abfa0b2788c24f9969681dbeec094ec9d0831c8d8eceb.scope - libcontainer container 8a5fe6af41b3ec5ef43abfa0b2788c24f9969681dbeec094ec9d0831c8d8eceb.
Mar 20 22:04:42.958783 containerd[1483]: time="2025-03-20T22:04:42.958674792Z" level=info msg="CreateContainer within sandbox \"11028ca05b9e5453938c55e5b694a67cf3d2b0e54d6d4a850aefdfa80444b9cf\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"1cbf8db31e91cc8f4af71c62fe547b318395ac8a2e3a023793d625cfb078fd9a\""
Mar 20 22:04:42.960160 containerd[1483]: time="2025-03-20T22:04:42.960122148Z" level=info msg="StartContainer for \"1cbf8db31e91cc8f4af71c62fe547b318395ac8a2e3a023793d625cfb078fd9a\""
Mar 20 22:04:42.962769 kubelet[2328]: W0320 22:04:42.962706 2328 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.166:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-2-f-52bc1ad8d1.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.166:6443: connect: connection refused
Mar 20 22:04:42.963186 kubelet[2328]: E0320 22:04:42.963139 2328 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.24.4.166:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-2-f-52bc1ad8d1.novalocal&limit=500&resourceVersion=0\": dial tcp 172.24.4.166:6443: connect: connection refused" logger="UnhandledError"
Mar 20 22:04:42.967073 containerd[1483]: time="2025-03-20T22:04:42.966931692Z" level=info msg="connecting to shim 1cbf8db31e91cc8f4af71c62fe547b318395ac8a2e3a023793d625cfb078fd9a" address="unix:///run/containerd/s/f704f4aa5de6f70cdb2e4252d9561edcea1eec7f34ddde782045337fe851c8b6" protocol=ttrpc version=3
Mar 20 22:04:42.970653 kubelet[2328]: I0320 22:04:42.970161 2328 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:42.970653 kubelet[2328]: E0320 22:04:42.970449 2328 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.166:6443/api/v1/nodes\": dial tcp 172.24.4.166:6443: connect: connection refused" node="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:42.974866 systemd[1]: Started cri-containerd-eaf948cbb58c8027df0d99ed65763d36df430adb397f2adf6c08b2239f17cd9d.scope - libcontainer container eaf948cbb58c8027df0d99ed65763d36df430adb397f2adf6c08b2239f17cd9d.
Mar 20 22:04:43.003802 systemd[1]: Started cri-containerd-1cbf8db31e91cc8f4af71c62fe547b318395ac8a2e3a023793d625cfb078fd9a.scope - libcontainer container 1cbf8db31e91cc8f4af71c62fe547b318395ac8a2e3a023793d625cfb078fd9a.
Mar 20 22:04:43.027347 kubelet[2328]: W0320 22:04:43.027114 2328 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.166:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.24.4.166:6443: connect: connection refused
Mar 20 22:04:43.027347 kubelet[2328]: E0320 22:04:43.027191 2328 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.24.4.166:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.24.4.166:6443: connect: connection refused" logger="UnhandledError"
Mar 20 22:04:43.053960 containerd[1483]: time="2025-03-20T22:04:43.053925187Z" level=info msg="StartContainer for \"8a5fe6af41b3ec5ef43abfa0b2788c24f9969681dbeec094ec9d0831c8d8eceb\" returns successfully"
Mar 20 22:04:43.069760 containerd[1483]: time="2025-03-20T22:04:43.069719014Z" level=info msg="StartContainer for \"eaf948cbb58c8027df0d99ed65763d36df430adb397f2adf6c08b2239f17cd9d\" returns successfully"
Mar 20 22:04:43.109733 containerd[1483]: time="2025-03-20T22:04:43.107773773Z" level=info msg="StartContainer for \"1cbf8db31e91cc8f4af71c62fe547b318395ac8a2e3a023793d625cfb078fd9a\" returns successfully"
Mar 20 22:04:43.772388 kubelet[2328]: I0320 22:04:43.772152 2328 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:45.159174 kubelet[2328]: E0320 22:04:45.159130 2328 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-9999-0-2-f-52bc1ad8d1.novalocal\" not found" node="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:45.266724 kubelet[2328]: I0320 22:04:45.266452 2328 kubelet_node_status.go:75] "Successfully registered node" node="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:45.266724 kubelet[2328]: E0320 22:04:45.266510 2328 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-9999-0-2-f-52bc1ad8d1.novalocal\": node \"ci-9999-0-2-f-52bc1ad8d1.novalocal\" not found"
Mar 20 22:04:45.293651 kubelet[2328]: E0320 22:04:45.292242 2328 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-0-2-f-52bc1ad8d1.novalocal\" not found"
Mar 20 22:04:46.136208 kubelet[2328]: I0320 22:04:46.136150 2328 apiserver.go:52] "Watching apiserver"
Mar 20 22:04:46.178834 kubelet[2328]: I0320 22:04:46.178763 2328 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 20 22:04:47.517984 systemd[1]: Reload requested from client PID 2596 ('systemctl') (unit session-11.scope)...
Mar 20 22:04:47.518499 systemd[1]: Reloading...
Mar 20 22:04:47.647672 zram_generator::config[2642]: No configuration found.
Mar 20 22:04:47.793723 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 20 22:04:47.929326 systemd[1]: Reloading finished in 410 ms.
Mar 20 22:04:47.953988 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:04:47.965487 systemd[1]: kubelet.service: Deactivated successfully.
Mar 20 22:04:47.965736 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:04:47.965787 systemd[1]: kubelet.service: Consumed 1.218s CPU time, 120.9M memory peak.
Mar 20 22:04:47.967952 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 22:04:48.287107 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 22:04:48.298874 (kubelet)[2706]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 20 22:04:48.348894 kubelet[2706]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 22:04:48.348894 kubelet[2706]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 20 22:04:48.348894 kubelet[2706]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 22:04:48.349237 kubelet[2706]: I0320 22:04:48.348981 2706 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 20 22:04:48.356646 kubelet[2706]: I0320 22:04:48.355587 2706 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Mar 20 22:04:48.356646 kubelet[2706]: I0320 22:04:48.355612 2706 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 22:04:48.356646 kubelet[2706]: I0320 22:04:48.355906 2706 server.go:929] "Client rotation is on, will bootstrap in background"
Mar 20 22:04:48.357549 kubelet[2706]: I0320 22:04:48.357527 2706 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 22:04:48.362039 kubelet[2706]: I0320 22:04:48.361972 2706 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 20 22:04:48.366435 kubelet[2706]: I0320 22:04:48.366413 2706 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 22:04:48.370699 kubelet[2706]: I0320 22:04:48.370670 2706 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 20 22:04:48.370947 kubelet[2706]: I0320 22:04:48.370936 2706 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 22:04:48.371436 kubelet[2706]: I0320 22:04:48.371105 2706 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 22:04:48.371436 kubelet[2706]: I0320 22:04:48.371138 2706 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-0-2-f-52bc1ad8d1.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 22:04:48.371436 kubelet[2706]: I0320 22:04:48.371316 2706 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 22:04:48.371436 kubelet[2706]: I0320 22:04:48.371327 2706 container_manager_linux.go:300] "Creating device plugin manager"
Mar 20 22:04:48.371641 kubelet[2706]: I0320 22:04:48.371358 2706 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 22:04:48.371720 kubelet[2706]: I0320 22:04:48.371709 2706 kubelet.go:408] "Attempting to sync node with API server"
Mar 20 22:04:48.372154 kubelet[2706]: I0320 22:04:48.372142 2706 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 22:04:48.372247 kubelet[2706]: I0320 22:04:48.372237 2706 kubelet.go:314] "Adding apiserver pod source"
Mar 20 22:04:48.372330 kubelet[2706]: I0320 22:04:48.372320 2706 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 22:04:48.383943 kubelet[2706]: I0320 22:04:48.383918 2706 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 20 22:04:48.385643 kubelet[2706]: I0320 22:04:48.384604 2706 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 22:04:48.385643 kubelet[2706]: I0320 22:04:48.385069 2706 server.go:1269] "Started kubelet"
Mar 20 22:04:48.393249 kubelet[2706]: I0320 22:04:48.393232 2706 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 22:04:48.402955 kubelet[2706]: I0320 22:04:48.402897 2706 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 22:04:48.405112 kubelet[2706]: I0320 22:04:48.405093 2706 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 20 22:04:48.408642 kubelet[2706]: I0320 22:04:48.406478 2706 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 22:04:48.408925 kubelet[2706]: E0320 22:04:48.408907 2706 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-0-2-f-52bc1ad8d1.novalocal\" not found"
Mar 20 22:04:48.409001 kubelet[2706]: I0320 22:04:48.406948 2706 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 22:04:48.409223 kubelet[2706]: I0320 22:04:48.409209 2706 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 22:04:48.412105 kubelet[2706]: I0320 22:04:48.412088 2706 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 22:04:48.412181 kubelet[2706]: I0320 22:04:48.406906 2706 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 22:04:48.413113 kubelet[2706]: I0320 22:04:48.413100 2706 reconciler.go:26] "Reconciler: start to sync state"
Mar 20 22:04:48.416207 kubelet[2706]: I0320 22:04:48.416086 2706 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 20 22:04:48.419396 kubelet[2706]: I0320 22:04:48.419366 2706 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 20 22:04:48.421077 kubelet[2706]: I0320 22:04:48.420604 2706 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 20 22:04:48.421077 kubelet[2706]: I0320 22:04:48.420644 2706 kubelet.go:2321] "Starting kubelet main sync loop"
Mar 20 22:04:48.421077 kubelet[2706]: E0320 22:04:48.420685 2706 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 20 22:04:48.428160 kubelet[2706]: I0320 22:04:48.427825 2706 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 20 22:04:48.429745 kubelet[2706]: E0320 22:04:48.429609 2706 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 20 22:04:48.430707 kubelet[2706]: I0320 22:04:48.430686 2706 factory.go:221] Registration of the containerd container factory successfully
Mar 20 22:04:48.431139 kubelet[2706]: I0320 22:04:48.431128 2706 factory.go:221] Registration of the systemd container factory successfully
Mar 20 22:04:48.501645 kubelet[2706]: I0320 22:04:48.501331 2706 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 20 22:04:48.501645 kubelet[2706]: I0320 22:04:48.501349 2706 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 20 22:04:48.501645 kubelet[2706]: I0320 22:04:48.501366 2706 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 22:04:48.501645 kubelet[2706]: I0320 22:04:48.501537 2706 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 20 22:04:48.501645 kubelet[2706]: I0320 22:04:48.501550 2706 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 20 22:04:48.501932 kubelet[2706]: I0320 22:04:48.501918 2706 policy_none.go:49] "None policy: Start"
Mar 20 22:04:48.503281 kubelet[2706]: I0320 22:04:48.503247 2706 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 20 22:04:48.503281 kubelet[2706]: I0320 22:04:48.503278 2706 state_mem.go:35] "Initializing new in-memory state store"
Mar 20 22:04:48.503515 kubelet[2706]: I0320 22:04:48.503487 2706 state_mem.go:75] "Updated machine memory state"
Mar 20 22:04:48.510923 kubelet[2706]: I0320 22:04:48.510901 2706 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 20 22:04:48.511093 kubelet[2706]: I0320 22:04:48.511069 2706 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 20 22:04:48.511127 kubelet[2706]: I0320 22:04:48.511086 2706 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 20 22:04:48.511731 kubelet[2706]: I0320 22:04:48.511577 2706 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 20 22:04:48.533566 kubelet[2706]: W0320 22:04:48.533530 2706 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 20 22:04:48.535078 kubelet[2706]: W0320 22:04:48.535032 2706 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 20 22:04:48.537709 kubelet[2706]: W0320 22:04:48.537273 2706 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 20 22:04:48.614773 kubelet[2706]: I0320 22:04:48.614719 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3ddb92c09a3fa308b1d8de8377de1088-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"3ddb92c09a3fa308b1d8de8377de1088\") " pod="kube-system/kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:48.614773 kubelet[2706]: I0320 22:04:48.614768 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/14049434aee82e13cb5f856d2f449e31-ca-certs\") pod \"kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"14049434aee82e13cb5f856d2f449e31\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:48.614953 kubelet[2706]: I0320 22:04:48.614798 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/14049434aee82e13cb5f856d2f449e31-kubeconfig\") pod \"kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"14049434aee82e13cb5f856d2f449e31\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:48.614953 kubelet[2706]: I0320 22:04:48.614820 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3ddb92c09a3fa308b1d8de8377de1088-ca-certs\") pod \"kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"3ddb92c09a3fa308b1d8de8377de1088\") " pod="kube-system/kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:48.614953 kubelet[2706]: I0320 22:04:48.614839 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3ddb92c09a3fa308b1d8de8377de1088-k8s-certs\") pod \"kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"3ddb92c09a3fa308b1d8de8377de1088\") " pod="kube-system/kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:48.614953 kubelet[2706]: I0320 22:04:48.614857 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/14049434aee82e13cb5f856d2f449e31-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"14049434aee82e13cb5f856d2f449e31\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:48.614953 kubelet[2706]: I0320 22:04:48.614874 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/14049434aee82e13cb5f856d2f449e31-k8s-certs\") pod \"kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"14049434aee82e13cb5f856d2f449e31\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:48.615111 kubelet[2706]: I0320 22:04:48.614899 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/14049434aee82e13cb5f856d2f449e31-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"14049434aee82e13cb5f856d2f449e31\") " pod="kube-system/kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:48.615111 kubelet[2706]: I0320 22:04:48.614921 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5821a8a65f5a893df1d692ff4b8f78da-kubeconfig\") pod \"kube-scheduler-ci-9999-0-2-f-52bc1ad8d1.novalocal\" (UID: \"5821a8a65f5a893df1d692ff4b8f78da\") " pod="kube-system/kube-scheduler-ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:48.620992 kubelet[2706]: I0320 22:04:48.619817 2706 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:48.635826 kubelet[2706]: I0320 22:04:48.634875 2706 kubelet_node_status.go:111] "Node was previously registered" node="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:48.635826 kubelet[2706]: I0320 22:04:48.634953 2706 kubelet_node_status.go:75] "Successfully registered node" node="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:49.374683 kubelet[2706]: I0320 22:04:49.374578 2706 apiserver.go:52] "Watching apiserver"
Mar 20 22:04:49.413842 kubelet[2706]: I0320 22:04:49.413306 2706 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 20 22:04:49.466452 kubelet[2706]: W0320 22:04:49.466409 2706 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 20 22:04:49.466581 kubelet[2706]: E0320 22:04:49.466462 2706 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:04:49.512128 kubelet[2706]: I0320 22:04:49.512071 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-9999-0-2-f-52bc1ad8d1.novalocal" podStartSLOduration=1.512056354 podStartE2EDuration="1.512056354s" podCreationTimestamp="2025-03-20 22:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:04:49.509010501 +0000 UTC m=+1.204894907" watchObservedRunningTime="2025-03-20 22:04:49.512056354 +0000 UTC m=+1.207940770"
Mar 20 22:04:49.556416 kubelet[2706]: I0320 22:04:49.556351 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-9999-0-2-f-52bc1ad8d1.novalocal" podStartSLOduration=1.55633394 podStartE2EDuration="1.55633394s" podCreationTimestamp="2025-03-20 22:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:04:49.531762228 +0000 UTC m=+1.227646644" watchObservedRunningTime="2025-03-20 22:04:49.55633394 +0000 UTC m=+1.252218356"
Mar 20 22:04:49.581614 kubelet[2706]: I0320 22:04:49.581554 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-9999-0-2-f-52bc1ad8d1.novalocal" podStartSLOduration=1.58151759 podStartE2EDuration="1.58151759s" podCreationTimestamp="2025-03-20 22:04:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:04:49.558940249 +0000 UTC m=+1.254824655" watchObservedRunningTime="2025-03-20 22:04:49.58151759 +0000 UTC m=+1.277401996"
Mar 20 22:04:52.630536 kubelet[2706]: I0320 22:04:52.630481 2706 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 20 22:04:52.633402 containerd[1483]: time="2025-03-20T22:04:52.632896575Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 20 22:04:52.634816 kubelet[2706]: I0320 22:04:52.634637 2706 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 20 22:04:53.573057 systemd[1]: Created slice kubepods-besteffort-podad9a523c_0465_4d4f_b9fe_2190739bb436.slice - libcontainer container kubepods-besteffort-podad9a523c_0465_4d4f_b9fe_2190739bb436.slice.
Mar 20 22:04:53.646497 kubelet[2706]: I0320 22:04:53.645902 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ad9a523c-0465-4d4f-b9fe-2190739bb436-kube-proxy\") pod \"kube-proxy-qrk6s\" (UID: \"ad9a523c-0465-4d4f-b9fe-2190739bb436\") " pod="kube-system/kube-proxy-qrk6s"
Mar 20 22:04:53.646497 kubelet[2706]: I0320 22:04:53.646091 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ad9a523c-0465-4d4f-b9fe-2190739bb436-xtables-lock\") pod \"kube-proxy-qrk6s\" (UID: \"ad9a523c-0465-4d4f-b9fe-2190739bb436\") " pod="kube-system/kube-proxy-qrk6s"
Mar 20 22:04:53.646497 kubelet[2706]: I0320 22:04:53.646174 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ad9a523c-0465-4d4f-b9fe-2190739bb436-lib-modules\") pod \"kube-proxy-qrk6s\" (UID: \"ad9a523c-0465-4d4f-b9fe-2190739bb436\") " pod="kube-system/kube-proxy-qrk6s"
Mar 20 22:04:53.646497 kubelet[2706]: I0320 22:04:53.646226 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bskj8\" (UniqueName: \"kubernetes.io/projected/ad9a523c-0465-4d4f-b9fe-2190739bb436-kube-api-access-bskj8\") pod \"kube-proxy-qrk6s\" (UID: \"ad9a523c-0465-4d4f-b9fe-2190739bb436\") " pod="kube-system/kube-proxy-qrk6s"
Mar 20 22:04:53.787328 systemd[1]: Created slice kubepods-besteffort-pod0ea17c11_92b8_4f26_9cbd_c7d03a6a5efb.slice - libcontainer container kubepods-besteffort-pod0ea17c11_92b8_4f26_9cbd_c7d03a6a5efb.slice.
Mar 20 22:04:53.849231 kubelet[2706]: I0320 22:04:53.848808 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddv8p\" (UniqueName: \"kubernetes.io/projected/0ea17c11-92b8-4f26-9cbd-c7d03a6a5efb-kube-api-access-ddv8p\") pod \"tigera-operator-64ff5465b7-9sl4r\" (UID: \"0ea17c11-92b8-4f26-9cbd-c7d03a6a5efb\") " pod="tigera-operator/tigera-operator-64ff5465b7-9sl4r"
Mar 20 22:04:53.850110 kubelet[2706]: I0320 22:04:53.849616 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0ea17c11-92b8-4f26-9cbd-c7d03a6a5efb-var-lib-calico\") pod \"tigera-operator-64ff5465b7-9sl4r\" (UID: \"0ea17c11-92b8-4f26-9cbd-c7d03a6a5efb\") " pod="tigera-operator/tigera-operator-64ff5465b7-9sl4r"
Mar 20 22:04:53.889084 containerd[1483]: time="2025-03-20T22:04:53.888764533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qrk6s,Uid:ad9a523c-0465-4d4f-b9fe-2190739bb436,Namespace:kube-system,Attempt:0,}"
Mar 20 22:04:53.935279 containerd[1483]: time="2025-03-20T22:04:53.934366459Z" level=info msg="connecting to shim 403875efc0c8395eef74190918dd385f53c0e518998bfea7ea2cd0373c4894de" address="unix:///run/containerd/s/2fcfa9efa3ed17e5e8c8be1a61e4f0c393c7a603de43a08425168ddc1f27e3d4" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:04:53.993794 systemd[1]: Started cri-containerd-403875efc0c8395eef74190918dd385f53c0e518998bfea7ea2cd0373c4894de.scope - libcontainer container 403875efc0c8395eef74190918dd385f53c0e518998bfea7ea2cd0373c4894de.
Mar 20 22:04:54.024112 containerd[1483]: time="2025-03-20T22:04:54.023984979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qrk6s,Uid:ad9a523c-0465-4d4f-b9fe-2190739bb436,Namespace:kube-system,Attempt:0,} returns sandbox id \"403875efc0c8395eef74190918dd385f53c0e518998bfea7ea2cd0373c4894de\""
Mar 20 22:04:54.027841 containerd[1483]: time="2025-03-20T22:04:54.027577858Z" level=info msg="CreateContainer within sandbox \"403875efc0c8395eef74190918dd385f53c0e518998bfea7ea2cd0373c4894de\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 20 22:04:54.042177 containerd[1483]: time="2025-03-20T22:04:54.041001417Z" level=info msg="Container e521ef8fe261f7249442f049865205f83d649f8b776da76477ac2545132c81b3: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:04:54.052101 containerd[1483]: time="2025-03-20T22:04:54.052035914Z" level=info msg="CreateContainer within sandbox \"403875efc0c8395eef74190918dd385f53c0e518998bfea7ea2cd0373c4894de\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e521ef8fe261f7249442f049865205f83d649f8b776da76477ac2545132c81b3\""
Mar 20 22:04:54.052886 containerd[1483]: time="2025-03-20T22:04:54.052859349Z" level=info msg="StartContainer for \"e521ef8fe261f7249442f049865205f83d649f8b776da76477ac2545132c81b3\""
Mar 20 22:04:54.054985 containerd[1483]: time="2025-03-20T22:04:54.054960591Z" level=info msg="connecting to shim e521ef8fe261f7249442f049865205f83d649f8b776da76477ac2545132c81b3" address="unix:///run/containerd/s/2fcfa9efa3ed17e5e8c8be1a61e4f0c393c7a603de43a08425168ddc1f27e3d4" protocol=ttrpc version=3
Mar 20 22:04:54.077786 systemd[1]: Started cri-containerd-e521ef8fe261f7249442f049865205f83d649f8b776da76477ac2545132c81b3.scope - libcontainer container e521ef8fe261f7249442f049865205f83d649f8b776da76477ac2545132c81b3.
Mar 20 22:04:54.093280 containerd[1483]: time="2025-03-20T22:04:54.093236265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-9sl4r,Uid:0ea17c11-92b8-4f26-9cbd-c7d03a6a5efb,Namespace:tigera-operator,Attempt:0,}"
Mar 20 22:04:54.118614 containerd[1483]: time="2025-03-20T22:04:54.117978363Z" level=info msg="connecting to shim de87b5c38f355b57eb81e7c47fe2d8b2dc3d3bd25be9b7a833b76362d3f3cc23" address="unix:///run/containerd/s/0e716ab086ef8c4caa658e2c1b5b67eabe3b1f7e98a5944f1562b476edef74dd" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:04:54.135605 containerd[1483]: time="2025-03-20T22:04:54.135566734Z" level=info msg="StartContainer for \"e521ef8fe261f7249442f049865205f83d649f8b776da76477ac2545132c81b3\" returns successfully"
Mar 20 22:04:54.153314 systemd[1]: Started cri-containerd-de87b5c38f355b57eb81e7c47fe2d8b2dc3d3bd25be9b7a833b76362d3f3cc23.scope - libcontainer container de87b5c38f355b57eb81e7c47fe2d8b2dc3d3bd25be9b7a833b76362d3f3cc23.
Mar 20 22:04:54.233092 containerd[1483]: time="2025-03-20T22:04:54.233043181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-9sl4r,Uid:0ea17c11-92b8-4f26-9cbd-c7d03a6a5efb,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"de87b5c38f355b57eb81e7c47fe2d8b2dc3d3bd25be9b7a833b76362d3f3cc23\""
Mar 20 22:04:54.235707 containerd[1483]: time="2025-03-20T22:04:54.235404802Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\""
Mar 20 22:04:54.335262 sudo[1754]: pam_unix(sudo:session): session closed for user root
Mar 20 22:04:54.479764 sshd[1753]: Connection closed by 172.24.4.1 port 39198
Mar 20 22:04:54.482128 sshd-session[1750]: pam_unix(sshd:session): session closed for user core
Mar 20 22:04:54.492198 systemd[1]: sshd@8-172.24.4.166:22-172.24.4.1:39198.service: Deactivated successfully.
Mar 20 22:04:54.497564 systemd[1]: session-11.scope: Deactivated successfully.
Mar 20 22:04:54.498800 kubelet[2706]: I0320 22:04:54.498524 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-qrk6s" podStartSLOduration=1.498494522 podStartE2EDuration="1.498494522s" podCreationTimestamp="2025-03-20 22:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:04:54.496558821 +0000 UTC m=+6.192443277" watchObservedRunningTime="2025-03-20 22:04:54.498494522 +0000 UTC m=+6.194378958"
Mar 20 22:04:54.498607 systemd[1]: session-11.scope: Consumed 6.981s CPU time, 226.2M memory peak.
Mar 20 22:04:54.501294 systemd-logind[1458]: Session 11 logged out. Waiting for processes to exit.
Mar 20 22:04:54.504255 systemd-logind[1458]: Removed session 11.
Mar 20 22:04:56.719263 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount547864546.mount: Deactivated successfully.
Mar 20 22:04:57.306111 containerd[1483]: time="2025-03-20T22:04:57.306058471Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:57.307198 containerd[1483]: time="2025-03-20T22:04:57.307107869Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008"
Mar 20 22:04:57.308352 containerd[1483]: time="2025-03-20T22:04:57.308317868Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:57.311478 containerd[1483]: time="2025-03-20T22:04:57.311416982Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:04:57.312176 containerd[1483]: time="2025-03-20T22:04:57.312048857Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 3.076613318s"
Mar 20 22:04:57.312176 containerd[1483]: time="2025-03-20T22:04:57.312083542Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\""
Mar 20 22:04:57.317727 containerd[1483]: time="2025-03-20T22:04:57.317552009Z" level=info msg="CreateContainer within sandbox \"de87b5c38f355b57eb81e7c47fe2d8b2dc3d3bd25be9b7a833b76362d3f3cc23\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 20 22:04:57.331264 containerd[1483]: time="2025-03-20T22:04:57.329583576Z" level=info msg="Container 2b6f0a89d435633f5371b977c892807b54dd3ddd38bf6008431701580e5b2318: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:04:57.334759 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2169922999.mount: Deactivated successfully.
Mar 20 22:04:57.340781 containerd[1483]: time="2025-03-20T22:04:57.340732317Z" level=info msg="CreateContainer within sandbox \"de87b5c38f355b57eb81e7c47fe2d8b2dc3d3bd25be9b7a833b76362d3f3cc23\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2b6f0a89d435633f5371b977c892807b54dd3ddd38bf6008431701580e5b2318\""
Mar 20 22:04:57.342316 containerd[1483]: time="2025-03-20T22:04:57.341287919Z" level=info msg="StartContainer for \"2b6f0a89d435633f5371b977c892807b54dd3ddd38bf6008431701580e5b2318\""
Mar 20 22:04:57.342316 containerd[1483]: time="2025-03-20T22:04:57.342085125Z" level=info msg="connecting to shim 2b6f0a89d435633f5371b977c892807b54dd3ddd38bf6008431701580e5b2318" address="unix:///run/containerd/s/0e716ab086ef8c4caa658e2c1b5b67eabe3b1f7e98a5944f1562b476edef74dd" protocol=ttrpc version=3
Mar 20 22:04:57.368956 systemd[1]: Started cri-containerd-2b6f0a89d435633f5371b977c892807b54dd3ddd38bf6008431701580e5b2318.scope - libcontainer container 2b6f0a89d435633f5371b977c892807b54dd3ddd38bf6008431701580e5b2318.
Mar 20 22:04:57.406202 containerd[1483]: time="2025-03-20T22:04:57.406166230Z" level=info msg="StartContainer for \"2b6f0a89d435633f5371b977c892807b54dd3ddd38bf6008431701580e5b2318\" returns successfully"
Mar 20 22:04:57.500543 kubelet[2706]: I0320 22:04:57.500370 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-64ff5465b7-9sl4r" podStartSLOduration=1.420398479 podStartE2EDuration="4.500349287s" podCreationTimestamp="2025-03-20 22:04:53 +0000 UTC" firstStartedPulling="2025-03-20 22:04:54.234163312 +0000 UTC m=+5.930047718" lastFinishedPulling="2025-03-20 22:04:57.31411412 +0000 UTC m=+9.009998526" observedRunningTime="2025-03-20 22:04:57.499834902 +0000 UTC m=+9.195719318" watchObservedRunningTime="2025-03-20 22:04:57.500349287 +0000 UTC m=+9.196233693"
Mar 20 22:05:00.681527 systemd[1]: Created slice kubepods-besteffort-pod4e2b7c02_da9a_45df_9a75_affe44fa3c51.slice - libcontainer container kubepods-besteffort-pod4e2b7c02_da9a_45df_9a75_affe44fa3c51.slice.
Mar 20 22:05:00.700096 kubelet[2706]: I0320 22:05:00.700038 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e2b7c02-da9a-45df-9a75-affe44fa3c51-tigera-ca-bundle\") pod \"calico-typha-6f9cf7597f-f57j7\" (UID: \"4e2b7c02-da9a-45df-9a75-affe44fa3c51\") " pod="calico-system/calico-typha-6f9cf7597f-f57j7"
Mar 20 22:05:00.700519 kubelet[2706]: I0320 22:05:00.700111 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxzvr\" (UniqueName: \"kubernetes.io/projected/4e2b7c02-da9a-45df-9a75-affe44fa3c51-kube-api-access-fxzvr\") pod \"calico-typha-6f9cf7597f-f57j7\" (UID: \"4e2b7c02-da9a-45df-9a75-affe44fa3c51\") " pod="calico-system/calico-typha-6f9cf7597f-f57j7"
Mar 20 22:05:00.700519 kubelet[2706]: I0320 22:05:00.700151 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4e2b7c02-da9a-45df-9a75-affe44fa3c51-typha-certs\") pod \"calico-typha-6f9cf7597f-f57j7\" (UID: \"4e2b7c02-da9a-45df-9a75-affe44fa3c51\") " pod="calico-system/calico-typha-6f9cf7597f-f57j7"
Mar 20 22:05:00.852137 systemd[1]: Created slice kubepods-besteffort-pode91b84c8_3cf6_4c26_a022_7574aade4501.slice - libcontainer container kubepods-besteffort-pode91b84c8_3cf6_4c26_a022_7574aade4501.slice.
Mar 20 22:05:00.901913 kubelet[2706]: I0320 22:05:00.901869 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e91b84c8-3cf6-4c26-a022-7574aade4501-node-certs\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.902121 kubelet[2706]: I0320 22:05:00.902096 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e91b84c8-3cf6-4c26-a022-7574aade4501-cni-log-dir\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.902215 kubelet[2706]: I0320 22:05:00.902198 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e91b84c8-3cf6-4c26-a022-7574aade4501-flexvol-driver-host\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.902299 kubelet[2706]: I0320 22:05:00.902285 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e91b84c8-3cf6-4c26-a022-7574aade4501-tigera-ca-bundle\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.902378 kubelet[2706]: I0320 22:05:00.902364 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e91b84c8-3cf6-4c26-a022-7574aade4501-lib-modules\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.902472 kubelet[2706]: I0320 22:05:00.902457 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e91b84c8-3cf6-4c26-a022-7574aade4501-cni-net-dir\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.902558 kubelet[2706]: I0320 22:05:00.902544 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vdtm\" (UniqueName: \"kubernetes.io/projected/e91b84c8-3cf6-4c26-a022-7574aade4501-kube-api-access-5vdtm\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.902676 kubelet[2706]: I0320 22:05:00.902660 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e91b84c8-3cf6-4c26-a022-7574aade4501-cni-bin-dir\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.902766 kubelet[2706]: I0320 22:05:00.902752 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e91b84c8-3cf6-4c26-a022-7574aade4501-xtables-lock\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.902875 kubelet[2706]: I0320 22:05:00.902854 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e91b84c8-3cf6-4c26-a022-7574aade4501-var-run-calico\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.903140 kubelet[2706]: I0320 22:05:00.902954 2706 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e91b84c8-3cf6-4c26-a022-7574aade4501-policysync\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.903140 kubelet[2706]: I0320 22:05:00.902981 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e91b84c8-3cf6-4c26-a022-7574aade4501-var-lib-calico\") pod \"calico-node-h5t7c\" (UID: \"e91b84c8-3cf6-4c26-a022-7574aade4501\") " pod="calico-system/calico-node-h5t7c"
Mar 20 22:05:00.957200 kubelet[2706]: E0320 22:05:00.956834 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4r6v" podUID="b6039ed2-4638-4c6e-9f5f-d86da6536f5a"
Mar 20 22:05:00.986938 containerd[1483]: time="2025-03-20T22:05:00.986892473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f9cf7597f-f57j7,Uid:4e2b7c02-da9a-45df-9a75-affe44fa3c51,Namespace:calico-system,Attempt:0,}"
Mar 20 22:05:01.003369 kubelet[2706]: I0320 22:05:01.003310 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b6039ed2-4638-4c6e-9f5f-d86da6536f5a-registration-dir\") pod \"csi-node-driver-b4r6v\" (UID: \"b6039ed2-4638-4c6e-9f5f-d86da6536f5a\") " pod="calico-system/csi-node-driver-b4r6v"
Mar 20 22:05:01.003369 kubelet[2706]: I0320 22:05:01.003374 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b6039ed2-4638-4c6e-9f5f-d86da6536f5a-socket-dir\") pod \"csi-node-driver-b4r6v\" (UID: \"b6039ed2-4638-4c6e-9f5f-d86da6536f5a\") " pod="calico-system/csi-node-driver-b4r6v"
Mar 20 22:05:01.003559 kubelet[2706]: I0320 22:05:01.003519 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4zgh\" (UniqueName: \"kubernetes.io/projected/b6039ed2-4638-4c6e-9f5f-d86da6536f5a-kube-api-access-s4zgh\") pod \"csi-node-driver-b4r6v\" (UID: \"b6039ed2-4638-4c6e-9f5f-d86da6536f5a\") " pod="calico-system/csi-node-driver-b4r6v"
Mar 20 22:05:01.003595 kubelet[2706]: I0320 22:05:01.003569 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b6039ed2-4638-4c6e-9f5f-d86da6536f5a-kubelet-dir\") pod \"csi-node-driver-b4r6v\" (UID: \"b6039ed2-4638-4c6e-9f5f-d86da6536f5a\") " pod="calico-system/csi-node-driver-b4r6v"
Mar 20 22:05:01.003648 kubelet[2706]: I0320 22:05:01.003593 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/b6039ed2-4638-4c6e-9f5f-d86da6536f5a-varrun\") pod \"csi-node-driver-b4r6v\" (UID: \"b6039ed2-4638-4c6e-9f5f-d86da6536f5a\") " pod="calico-system/csi-node-driver-b4r6v"
Mar 20 22:05:01.022931 kubelet[2706]: E0320 22:05:01.020328 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.022931 kubelet[2706]: W0320 22:05:01.020352 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.022931 kubelet[2706]: E0320 22:05:01.020376 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 20 22:05:01.039711 containerd[1483]: time="2025-03-20T22:05:01.039046555Z" level=info msg="connecting to shim 6e8559f67cbcd9613915b25e4b5f181197cf7ef1b6302e72b40b34f1388216be" address="unix:///run/containerd/s/c85b300fe050485915c6491c31e296d4afc040a2823610eb553378c3ecfd01d2" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:05:01.065866 kubelet[2706]: E0320 22:05:01.065824 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.065866 kubelet[2706]: W0320 22:05:01.065851 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.065866 kubelet[2706]: E0320 22:05:01.065872 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.074675 systemd[1]: Started cri-containerd-6e8559f67cbcd9613915b25e4b5f181197cf7ef1b6302e72b40b34f1388216be.scope - libcontainer container 6e8559f67cbcd9613915b25e4b5f181197cf7ef1b6302e72b40b34f1388216be.
Mar 20 22:05:01.104782 kubelet[2706]: E0320 22:05:01.104667 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.104782 kubelet[2706]: W0320 22:05:01.104691 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.104782 kubelet[2706]: E0320 22:05:01.104712 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.105254 kubelet[2706]: E0320 22:05:01.105100 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.105254 kubelet[2706]: W0320 22:05:01.105115 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.105254 kubelet[2706]: E0320 22:05:01.105133 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.105583 kubelet[2706]: E0320 22:05:01.105449 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.105583 kubelet[2706]: W0320 22:05:01.105477 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.105583 kubelet[2706]: E0320 22:05:01.105523 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.105956 kubelet[2706]: E0320 22:05:01.105886 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.105956 kubelet[2706]: W0320 22:05:01.105897 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.105956 kubelet[2706]: E0320 22:05:01.105913 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.106727 kubelet[2706]: E0320 22:05:01.106457 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.106778 kubelet[2706]: W0320 22:05:01.106722 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.106778 kubelet[2706]: E0320 22:05:01.106754 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 20 22:05:01.107111 kubelet[2706]: E0320 22:05:01.107093 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.107111 kubelet[2706]: W0320 22:05:01.107107 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.107244 kubelet[2706]: E0320 22:05:01.107222 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.107651 kubelet[2706]: E0320 22:05:01.107610 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.107759 kubelet[2706]: W0320 22:05:01.107742 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.107886 kubelet[2706]: E0320 22:05:01.107814 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.108281 kubelet[2706]: E0320 22:05:01.108252 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.108281 kubelet[2706]: W0320 22:05:01.108279 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.108408 kubelet[2706]: E0320 22:05:01.108342 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.108954 kubelet[2706]: E0320 22:05:01.108934 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.108954 kubelet[2706]: W0320 22:05:01.108952 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.109113 kubelet[2706]: E0320 22:05:01.109014 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.109725 kubelet[2706]: E0320 22:05:01.109693 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.109725 kubelet[2706]: W0320 22:05:01.109707 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.109880 kubelet[2706]: E0320 22:05:01.109788 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.110148 kubelet[2706]: E0320 22:05:01.110126 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.110148 kubelet[2706]: W0320 22:05:01.110139 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.110298 kubelet[2706]: E0320 22:05:01.110206 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 20 22:05:01.110739 kubelet[2706]: E0320 22:05:01.110720 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.110739 kubelet[2706]: W0320 22:05:01.110736 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.110935 kubelet[2706]: E0320 22:05:01.110915 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.111275 kubelet[2706]: E0320 22:05:01.111261 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.111275 kubelet[2706]: W0320 22:05:01.111274 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.111429 kubelet[2706]: E0320 22:05:01.111342 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.111918 kubelet[2706]: E0320 22:05:01.111902 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.111918 kubelet[2706]: W0320 22:05:01.111916 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.112064 kubelet[2706]: E0320 22:05:01.111975 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.112773 kubelet[2706]: E0320 22:05:01.112725 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.112773 kubelet[2706]: W0320 22:05:01.112748 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.112974 kubelet[2706]: E0320 22:05:01.112860 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.113283 kubelet[2706]: E0320 22:05:01.113259 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.113421 kubelet[2706]: W0320 22:05:01.113402 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.113421 kubelet[2706]: E0320 22:05:01.113465 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.114440 kubelet[2706]: E0320 22:05:01.114423 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.114440 kubelet[2706]: W0320 22:05:01.114438 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.114721 kubelet[2706]: E0320 22:05:01.114495 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 20 22:05:01.115221 kubelet[2706]: E0320 22:05:01.115162 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.115221 kubelet[2706]: W0320 22:05:01.115174 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.115221 kubelet[2706]: E0320 22:05:01.115210 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.115617 kubelet[2706]: E0320 22:05:01.115550 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.115617 kubelet[2706]: W0320 22:05:01.115562 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.115978 kubelet[2706]: E0320 22:05:01.115863 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.115978 kubelet[2706]: W0320 22:05:01.115874 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.116191 kubelet[2706]: E0320 22:05:01.116170 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.116354 kubelet[2706]: W0320 22:05:01.116241 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.116534 kubelet[2706]: E0320 22:05:01.116512 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:01.116735 kubelet[2706]: W0320 22:05:01.116594 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:01.116735 kubelet[2706]: E0320 22:05:01.116633 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.116735 kubelet[2706]: E0320 22:05:01.116650 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.116735 kubelet[2706]: E0320 22:05:01.116661 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 22:05:01.116735 kubelet[2706]: E0320 22:05:01.116653 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 20 22:05:01.117160 kubelet[2706]: E0320 22:05:01.117035 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:01.117160 kubelet[2706]: W0320 22:05:01.117047 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:01.117160 kubelet[2706]: E0320 22:05:01.117074 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:01.117553 kubelet[2706]: E0320 22:05:01.117443 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:01.117553 kubelet[2706]: W0320 22:05:01.117454 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:01.117553 kubelet[2706]: E0320 22:05:01.117482 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:01.118092 kubelet[2706]: E0320 22:05:01.117849 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:01.118092 kubelet[2706]: W0320 22:05:01.117874 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:01.118092 kubelet[2706]: E0320 22:05:01.117884 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:01.134240 kubelet[2706]: E0320 22:05:01.134219 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:01.134398 kubelet[2706]: W0320 22:05:01.134350 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:01.134398 kubelet[2706]: E0320 22:05:01.134371 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:01.156386 containerd[1483]: time="2025-03-20T22:05:01.156242762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-h5t7c,Uid:e91b84c8-3cf6-4c26-a022-7574aade4501,Namespace:calico-system,Attempt:0,}" Mar 20 22:05:01.169395 containerd[1483]: time="2025-03-20T22:05:01.169339487Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f9cf7597f-f57j7,Uid:4e2b7c02-da9a-45df-9a75-affe44fa3c51,Namespace:calico-system,Attempt:0,} returns sandbox id \"6e8559f67cbcd9613915b25e4b5f181197cf7ef1b6302e72b40b34f1388216be\"" Mar 20 22:05:01.172168 containerd[1483]: time="2025-03-20T22:05:01.171865104Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 20 22:05:01.194802 containerd[1483]: time="2025-03-20T22:05:01.194669957Z" level=info msg="connecting to shim bac8ff37e8b4250c36c84b627134ba70702e266797c9b982ec84ebcd939e99df" address="unix:///run/containerd/s/0f50ea2d63e850ac82c365b63399c61ef906eecef9a3e945fb45866be88c3210" namespace=k8s.io protocol=ttrpc version=3 Mar 20 22:05:01.221871 systemd[1]: Started cri-containerd-bac8ff37e8b4250c36c84b627134ba70702e266797c9b982ec84ebcd939e99df.scope - libcontainer container bac8ff37e8b4250c36c84b627134ba70702e266797c9b982ec84ebcd939e99df. 
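The repeated kubelet errors above are FlexVolume plugin probing: kubelet scans its volume-plugin directory (/opt/libexec/kubernetes/kubelet-plugins/volume/exec on this host), and for each vendor~driver subdirectory it executes the driver binary with the argument `init` and unmarshals the binary's stdout as JSON. Here the nodeagent~uds/uds executable does not exist, so the call produces empty output, and unmarshalling an empty string fails with "unexpected end of JSON input". A minimal sketch of the `init` handshake a conforming driver would answer (a hypothetical stand-in, not the real uds binary, which is expected to be installed separately):

```python
#!/usr/bin/env python3
"""Minimal sketch of a FlexVolume driver's `init` handshake (hypothetical).

kubelet invokes the driver as `<driver> init` and expects a single JSON
object on stdout containing at least a "status" field.
"""
import json
import sys


def handle_call(args):
    """Return the JSON reply for one driver invocation."""
    if args and args[0] == "init":
        # Report success; "attach": False tells kubelet this driver does
        # not implement attach/detach operations.
        return json.dumps({"status": "Success",
                           "capabilities": {"attach": False}})
    # Operations this sketch does not implement.
    return json.dumps({"status": "Not supported"})


if __name__ == "__main__":
    print(handle_call(sys.argv[1:]))
```

With a reply like this in place, the probe succeeds and the "error creating Flexvolume plugin" messages stop; with no executable at all, kubelet retries the probe on every plugin rescan, which is why the same triplet recurs throughout this log.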
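Bursts like these are easier to audit once consecutive duplicates are collapsed. A small sketch, assuming every kubelet message uses klog's header layout (severity letter plus mmdd, wall time, pid, source file:line, then the message); the sample lines are abbreviated entries modeled on this log:

```python
import re
from itertools import groupby

# klog header: severity letter + mmdd, wall time, pid, file:line] message.
KLOG = re.compile(
    r"(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<time>\d{2}:\d{2}:\d{2}\.\d{6})"
    r"\s+(?P<pid>\d+) (?P<src>[\w.-]+:\d+)\] (?P<msg>.*)")


def collapse(lines):
    """Group consecutive lines whose (severity, source, message) repeat,
    ignoring timestamps, and return (count, key) pairs."""
    records = [KLOG.match(line).groupdict() for line in lines]
    key = lambda r: (r["sev"], r["src"], r["msg"])
    return [(sum(1 for _ in group), k) for k, group in groupby(records, key=key)]


sample = [
    'E0320 22:05:04.519562 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input',
    'E0320 22:05:04.519831 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input',
    'W0320 22:05:04.519581 2706 driver-call.go:149] FlexVolume: driver call failed',
]
runs = collapse(sample)
```

`runs` holds one entry per consecutive run, so the two identical driver-call.go:262 errors collapse into a single record with count 2, mirroring the "repeated N times" notes used in this cleaned-up log.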
Mar 20 22:05:01.271046 containerd[1483]: time="2025-03-20T22:05:01.270997951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-h5t7c,Uid:e91b84c8-3cf6-4c26-a022-7574aade4501,Namespace:calico-system,Attempt:0,} returns sandbox id \"bac8ff37e8b4250c36c84b627134ba70702e266797c9b982ec84ebcd939e99df\""
Mar 20 22:05:02.904372 kubelet[2706]: E0320 22:05:02.904306 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:02.904372 kubelet[2706]: W0320 22:05:02.904328 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:02.904372 kubelet[2706]: E0320 22:05:02.904348 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[kubelet: the driver-call/plugin-probe error sequence above repeats 24 more times between 22:05:02.904 and 22:05:02.908; identical entries omitted]
Mar 20 22:05:03.425265 kubelet[2706]: E0320 22:05:03.425142 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4r6v" podUID="b6039ed2-4638-4c6e-9f5f-d86da6536f5a"
Mar 20 22:05:04.331857 containerd[1483]: time="2025-03-20T22:05:04.331796627Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:04.333296 containerd[1483]: time="2025-03-20T22:05:04.333144505Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075"
Mar 20 22:05:04.334437 containerd[1483]: time="2025-03-20T22:05:04.334375595Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:04.336983 containerd[1483]: time="2025-03-20T22:05:04.336918173Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:04.337787 containerd[1483]: time="2025-03-20T22:05:04.337663371Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 3.165758332s"
Mar 20 22:05:04.337787 containerd[1483]: time="2025-03-20T22:05:04.337695111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\""
Mar 20 22:05:04.339387 containerd[1483]: time="2025-03-20T22:05:04.339192429Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\""
Mar 20 22:05:04.353315 containerd[1483]: time="2025-03-20T22:05:04.353065620Z" level=info msg="CreateContainer within sandbox \"6e8559f67cbcd9613915b25e4b5f181197cf7ef1b6302e72b40b34f1388216be\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 20 22:05:04.368504 containerd[1483]: time="2025-03-20T22:05:04.366592250Z" level=info msg="Container 1b46922250f6df593b83b98239f4cffdd98cbfd7cadb8411c2968350d0ab137c: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:05:04.381278 containerd[1483]: time="2025-03-20T22:05:04.381237479Z" level=info msg="CreateContainer within sandbox \"6e8559f67cbcd9613915b25e4b5f181197cf7ef1b6302e72b40b34f1388216be\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"1b46922250f6df593b83b98239f4cffdd98cbfd7cadb8411c2968350d0ab137c\""
Mar 20 22:05:04.383322 containerd[1483]: time="2025-03-20T22:05:04.381925550Z" level=info msg="StartContainer for \"1b46922250f6df593b83b98239f4cffdd98cbfd7cadb8411c2968350d0ab137c\""
Mar 20 22:05:04.383322 containerd[1483]: time="2025-03-20T22:05:04.383062442Z" level=info 
msg="connecting to shim 1b46922250f6df593b83b98239f4cffdd98cbfd7cadb8411c2968350d0ab137c" address="unix:///run/containerd/s/c85b300fe050485915c6491c31e296d4afc040a2823610eb553378c3ecfd01d2" protocol=ttrpc version=3
Mar 20 22:05:04.408785 systemd[1]: Started cri-containerd-1b46922250f6df593b83b98239f4cffdd98cbfd7cadb8411c2968350d0ab137c.scope - libcontainer container 1b46922250f6df593b83b98239f4cffdd98cbfd7cadb8411c2968350d0ab137c.
Mar 20 22:05:04.465043 containerd[1483]: time="2025-03-20T22:05:04.464908744Z" level=info msg="StartContainer for \"1b46922250f6df593b83b98239f4cffdd98cbfd7cadb8411c2968350d0ab137c\" returns successfully"
Mar 20 22:05:04.519604 kubelet[2706]: E0320 22:05:04.519562 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:04.519604 kubelet[2706]: W0320 22:05:04.519581 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:04.519604 kubelet[2706]: E0320 22:05:04.519598 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[kubelet: the driver-call/plugin-probe error sequence above repeats 14 more times between 22:05:04.519 and 22:05:04.523; identical entries omitted]
Mar 20 22:05:04.549737 kubelet[2706]: E0320 22:05:04.549701 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:04.549737 kubelet[2706]: W0320 22:05:04.549724 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:04.549737 kubelet[2706]: E0320 22:05:04.549745 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[kubelet: the sequence repeats once more at 22:05:04.550; identical entries omitted]
Mar 20 22:05:04.551002 kubelet[2706]: E0320 22:05:04.550982 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 22:05:04.551002 kubelet[2706]: W0320 22:05:04.550997 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 22:05:04.551119 kubelet[2706]: E0320 22:05:04.551011 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:04.551284 kubelet[2706]: E0320 22:05:04.551250 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.551284 kubelet[2706]: W0320 22:05:04.551265 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.551284 kubelet[2706]: E0320 22:05:04.551279 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:04.551490 kubelet[2706]: E0320 22:05:04.551469 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.551490 kubelet[2706]: W0320 22:05:04.551479 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.551490 kubelet[2706]: E0320 22:05:04.551489 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:04.552772 kubelet[2706]: E0320 22:05:04.552696 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.552772 kubelet[2706]: W0320 22:05:04.552710 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.552972 kubelet[2706]: E0320 22:05:04.552801 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:04.552972 kubelet[2706]: E0320 22:05:04.552960 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.552972 kubelet[2706]: W0320 22:05:04.552969 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.553178 kubelet[2706]: E0320 22:05:04.553058 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:04.553220 kubelet[2706]: E0320 22:05:04.553210 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.553251 kubelet[2706]: W0320 22:05:04.553220 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.553384 kubelet[2706]: E0320 22:05:04.553307 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:04.553562 kubelet[2706]: E0320 22:05:04.553544 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.553562 kubelet[2706]: W0320 22:05:04.553559 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.553846 kubelet[2706]: E0320 22:05:04.553573 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:04.554059 kubelet[2706]: E0320 22:05:04.554011 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.554059 kubelet[2706]: W0320 22:05:04.554032 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.554230 kubelet[2706]: E0320 22:05:04.554153 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:04.554745 kubelet[2706]: E0320 22:05:04.554689 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.554745 kubelet[2706]: W0320 22:05:04.554701 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.554745 kubelet[2706]: E0320 22:05:04.554716 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:04.555228 kubelet[2706]: E0320 22:05:04.555191 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.555228 kubelet[2706]: W0320 22:05:04.555207 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.555228 kubelet[2706]: E0320 22:05:04.555222 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:04.555431 kubelet[2706]: E0320 22:05:04.555425 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.555470 kubelet[2706]: W0320 22:05:04.555435 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.555470 kubelet[2706]: E0320 22:05:04.555444 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:04.556092 kubelet[2706]: E0320 22:05:04.556048 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.556092 kubelet[2706]: W0320 22:05:04.556061 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.556225 kubelet[2706]: E0320 22:05:04.556160 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:04.557711 kubelet[2706]: E0320 22:05:04.557688 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.557711 kubelet[2706]: W0320 22:05:04.557704 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.557924 kubelet[2706]: E0320 22:05:04.557842 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:04.557924 kubelet[2706]: E0320 22:05:04.557893 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.557924 kubelet[2706]: W0320 22:05:04.557902 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.558148 kubelet[2706]: E0320 22:05:04.557997 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:04.558711 kubelet[2706]: E0320 22:05:04.558690 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.558711 kubelet[2706]: W0320 22:05:04.558705 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.558800 kubelet[2706]: E0320 22:05:04.558720 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:04.559902 kubelet[2706]: E0320 22:05:04.559883 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:04.559902 kubelet[2706]: W0320 22:05:04.559897 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:04.560002 kubelet[2706]: E0320 22:05:04.559908 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.421173 kubelet[2706]: E0320 22:05:05.421050 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4r6v" podUID="b6039ed2-4638-4c6e-9f5f-d86da6536f5a" Mar 20 22:05:05.517147 kubelet[2706]: I0320 22:05:05.517119 2706 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 22:05:05.531199 kubelet[2706]: E0320 22:05:05.531165 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.531199 kubelet[2706]: W0320 22:05:05.531189 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.531562 kubelet[2706]: E0320 22:05:05.531211 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.531562 kubelet[2706]: E0320 22:05:05.531428 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.531562 kubelet[2706]: W0320 22:05:05.531438 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.531562 kubelet[2706]: E0320 22:05:05.531447 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.531697 kubelet[2706]: E0320 22:05:05.531608 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.531697 kubelet[2706]: W0320 22:05:05.531616 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.531697 kubelet[2706]: E0320 22:05:05.531660 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.531851 kubelet[2706]: E0320 22:05:05.531835 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.531851 kubelet[2706]: W0320 22:05:05.531847 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.531924 kubelet[2706]: E0320 22:05:05.531858 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.532053 kubelet[2706]: E0320 22:05:05.532038 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.532053 kubelet[2706]: W0320 22:05:05.532050 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.532119 kubelet[2706]: E0320 22:05:05.532059 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.532244 kubelet[2706]: E0320 22:05:05.532229 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.532244 kubelet[2706]: W0320 22:05:05.532241 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.532313 kubelet[2706]: E0320 22:05:05.532250 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.532454 kubelet[2706]: E0320 22:05:05.532439 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.532454 kubelet[2706]: W0320 22:05:05.532451 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.532536 kubelet[2706]: E0320 22:05:05.532461 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.532958 kubelet[2706]: E0320 22:05:05.532940 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.532958 kubelet[2706]: W0320 22:05:05.532954 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.533054 kubelet[2706]: E0320 22:05:05.532964 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.533173 kubelet[2706]: E0320 22:05:05.533158 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.533173 kubelet[2706]: W0320 22:05:05.533169 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.533243 kubelet[2706]: E0320 22:05:05.533178 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.533368 kubelet[2706]: E0320 22:05:05.533351 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.533368 kubelet[2706]: W0320 22:05:05.533364 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.533480 kubelet[2706]: E0320 22:05:05.533373 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.533614 kubelet[2706]: E0320 22:05:05.533539 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.533614 kubelet[2706]: W0320 22:05:05.533550 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.533614 kubelet[2706]: E0320 22:05:05.533559 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.533817 kubelet[2706]: E0320 22:05:05.533793 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.533817 kubelet[2706]: W0320 22:05:05.533817 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.533929 kubelet[2706]: E0320 22:05:05.533827 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.535249 kubelet[2706]: E0320 22:05:05.534350 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.535249 kubelet[2706]: W0320 22:05:05.534364 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.535249 kubelet[2706]: E0320 22:05:05.534393 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.535249 kubelet[2706]: E0320 22:05:05.534560 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.535249 kubelet[2706]: W0320 22:05:05.534569 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.535249 kubelet[2706]: E0320 22:05:05.534578 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.535869 kubelet[2706]: E0320 22:05:05.535703 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.535869 kubelet[2706]: W0320 22:05:05.535715 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.535869 kubelet[2706]: E0320 22:05:05.535726 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.558852 kubelet[2706]: E0320 22:05:05.558828 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.559113 kubelet[2706]: W0320 22:05:05.558987 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.559113 kubelet[2706]: E0320 22:05:05.559010 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.559343 kubelet[2706]: E0320 22:05:05.559311 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.559343 kubelet[2706]: W0320 22:05:05.559323 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.559545 kubelet[2706]: E0320 22:05:05.559526 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.559784 kubelet[2706]: E0320 22:05:05.559683 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.559784 kubelet[2706]: W0320 22:05:05.559694 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.559784 kubelet[2706]: E0320 22:05:05.559704 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.559996 kubelet[2706]: E0320 22:05:05.559961 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.559996 kubelet[2706]: W0320 22:05:05.559972 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.560126 kubelet[2706]: E0320 22:05:05.560085 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.560360 kubelet[2706]: E0320 22:05:05.560336 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.560360 kubelet[2706]: W0320 22:05:05.560348 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.560515 kubelet[2706]: E0320 22:05:05.560452 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.561107 kubelet[2706]: E0320 22:05:05.560755 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.561107 kubelet[2706]: W0320 22:05:05.560766 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.561107 kubelet[2706]: E0320 22:05:05.560781 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.561259 kubelet[2706]: E0320 22:05:05.561248 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.561327 kubelet[2706]: W0320 22:05:05.561316 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.561450 kubelet[2706]: E0320 22:05:05.561415 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.561617 kubelet[2706]: E0320 22:05:05.561605 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.561723 kubelet[2706]: W0320 22:05:05.561712 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.561830 kubelet[2706]: E0320 22:05:05.561801 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.562141 kubelet[2706]: E0320 22:05:05.561995 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.562141 kubelet[2706]: W0320 22:05:05.562016 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.562141 kubelet[2706]: E0320 22:05:05.562032 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.562997 kubelet[2706]: E0320 22:05:05.562984 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.563156 kubelet[2706]: W0320 22:05:05.563051 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.563156 kubelet[2706]: E0320 22:05:05.563075 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.563291 kubelet[2706]: E0320 22:05:05.563280 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.563356 kubelet[2706]: W0320 22:05:05.563346 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.563458 kubelet[2706]: E0320 22:05:05.563430 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.563759 kubelet[2706]: E0320 22:05:05.563648 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.563759 kubelet[2706]: W0320 22:05:05.563659 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.563759 kubelet[2706]: E0320 22:05:05.563675 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.563918 kubelet[2706]: E0320 22:05:05.563906 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.564091 kubelet[2706]: W0320 22:05:05.563971 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.564091 kubelet[2706]: E0320 22:05:05.563991 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.564229 kubelet[2706]: E0320 22:05:05.564219 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.564290 kubelet[2706]: W0320 22:05:05.564279 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.564521 kubelet[2706]: E0320 22:05:05.564339 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.564679 kubelet[2706]: E0320 22:05:05.564661 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.564723 kubelet[2706]: W0320 22:05:05.564679 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.564723 kubelet[2706]: E0320 22:05:05.564698 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.564879 kubelet[2706]: E0320 22:05:05.564861 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.564917 kubelet[2706]: W0320 22:05:05.564889 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.564975 kubelet[2706]: E0320 22:05:05.564939 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:05.565261 kubelet[2706]: E0320 22:05:05.565246 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.565261 kubelet[2706]: W0320 22:05:05.565260 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.565338 kubelet[2706]: E0320 22:05:05.565274 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 22:05:05.565434 kubelet[2706]: E0320 22:05:05.565421 2706 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 22:05:05.565434 kubelet[2706]: W0320 22:05:05.565433 2706 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 22:05:05.565489 kubelet[2706]: E0320 22:05:05.565442 2706 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 22:05:06.414081 containerd[1483]: time="2025-03-20T22:05:06.414039271Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:05:06.415449 containerd[1483]: time="2025-03-20T22:05:06.415390956Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 20 22:05:06.416742 containerd[1483]: time="2025-03-20T22:05:06.416691325Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:05:06.419812 containerd[1483]: time="2025-03-20T22:05:06.419138465Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:05:06.419812 containerd[1483]: time="2025-03-20T22:05:06.419705418Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 2.080482311s" Mar 20 22:05:06.419812 containerd[1483]: time="2025-03-20T22:05:06.419734432Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 20 22:05:06.421579 containerd[1483]: time="2025-03-20T22:05:06.421550949Z" level=info msg="CreateContainer within sandbox \"bac8ff37e8b4250c36c84b627134ba70702e266797c9b982ec84ebcd939e99df\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 20 22:05:06.432888 containerd[1483]: time="2025-03-20T22:05:06.432855011Z" level=info msg="Container a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:05:06.453150 containerd[1483]: time="2025-03-20T22:05:06.453079033Z" level=info msg="CreateContainer within sandbox \"bac8ff37e8b4250c36c84b627134ba70702e266797c9b982ec84ebcd939e99df\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156\"" Mar 20 22:05:06.454385 containerd[1483]: time="2025-03-20T22:05:06.453920792Z" level=info msg="StartContainer for \"a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156\"" Mar 20 22:05:06.457044 containerd[1483]: time="2025-03-20T22:05:06.456572847Z" level=info msg="connecting to shim a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156" address="unix:///run/containerd/s/0f50ea2d63e850ac82c365b63399c61ef906eecef9a3e945fb45866be88c3210" protocol=ttrpc version=3 Mar 20 22:05:06.493831 systemd[1]: Started cri-containerd-a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156.scope - libcontainer container a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156. Mar 20 22:05:06.546104 containerd[1483]: time="2025-03-20T22:05:06.546072321Z" level=info msg="StartContainer for \"a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156\" returns successfully" Mar 20 22:05:06.558507 systemd[1]: cri-containerd-a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156.scope: Deactivated successfully. 
Mar 20 22:05:06.561088 containerd[1483]: time="2025-03-20T22:05:06.560862161Z" level=info msg="received exit event container_id:\"a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156\" id:\"a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156\" pid:3368 exited_at:{seconds:1742508306 nanos:560397350}" Mar 20 22:05:06.561642 containerd[1483]: time="2025-03-20T22:05:06.561399810Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156\" id:\"a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156\" pid:3368 exited_at:{seconds:1742508306 nanos:560397350}" Mar 20 22:05:06.583550 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a570b706befba79da7960e9867bd00cf9cdba85b8146ee10dcf0cb24d880f156-rootfs.mount: Deactivated successfully. Mar 20 22:05:07.422104 kubelet[2706]: E0320 22:05:07.421979 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4r6v" podUID="b6039ed2-4638-4c6e-9f5f-d86da6536f5a" Mar 20 22:05:07.535149 containerd[1483]: time="2025-03-20T22:05:07.535055031Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 20 22:05:07.577418 kubelet[2706]: I0320 22:05:07.574328 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6f9cf7597f-f57j7" podStartSLOduration=4.406898705 podStartE2EDuration="7.574272727s" podCreationTimestamp="2025-03-20 22:05:00 +0000 UTC" firstStartedPulling="2025-03-20 22:05:01.171305033 +0000 UTC m=+12.867189440" lastFinishedPulling="2025-03-20 22:05:04.338679056 +0000 UTC m=+16.034563462" observedRunningTime="2025-03-20 22:05:04.535564446 +0000 UTC m=+16.231448862" watchObservedRunningTime="2025-03-20 22:05:07.574272727 +0000 UTC 
m=+19.270157184" Mar 20 22:05:09.422431 kubelet[2706]: E0320 22:05:09.421674 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4r6v" podUID="b6039ed2-4638-4c6e-9f5f-d86da6536f5a" Mar 20 22:05:11.421469 kubelet[2706]: E0320 22:05:11.421375 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4r6v" podUID="b6039ed2-4638-4c6e-9f5f-d86da6536f5a" Mar 20 22:05:13.421525 kubelet[2706]: E0320 22:05:13.421143 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b4r6v" podUID="b6039ed2-4638-4c6e-9f5f-d86da6536f5a" Mar 20 22:05:13.646592 containerd[1483]: time="2025-03-20T22:05:13.646543164Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:05:13.649014 containerd[1483]: time="2025-03-20T22:05:13.648775903Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 20 22:05:13.650609 containerd[1483]: time="2025-03-20T22:05:13.650547285Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:05:13.653870 containerd[1483]: time="2025-03-20T22:05:13.653715226Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:05:13.654710 containerd[1483]: time="2025-03-20T22:05:13.654570931Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 6.118446675s" Mar 20 22:05:13.654710 containerd[1483]: time="2025-03-20T22:05:13.654605857Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 20 22:05:13.657556 containerd[1483]: time="2025-03-20T22:05:13.656943802Z" level=info msg="CreateContainer within sandbox \"bac8ff37e8b4250c36c84b627134ba70702e266797c9b982ec84ebcd939e99df\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 20 22:05:13.670162 containerd[1483]: time="2025-03-20T22:05:13.670123752Z" level=info msg="Container d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:05:13.682446 containerd[1483]: time="2025-03-20T22:05:13.682260705Z" level=info msg="CreateContainer within sandbox \"bac8ff37e8b4250c36c84b627134ba70702e266797c9b982ec84ebcd939e99df\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b\"" Mar 20 22:05:13.684616 containerd[1483]: time="2025-03-20T22:05:13.682985024Z" level=info msg="StartContainer for \"d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b\"" Mar 20 22:05:13.684616 containerd[1483]: time="2025-03-20T22:05:13.684559617Z" level=info msg="connecting to shim 
d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b" address="unix:///run/containerd/s/0f50ea2d63e850ac82c365b63399c61ef906eecef9a3e945fb45866be88c3210" protocol=ttrpc version=3 Mar 20 22:05:13.706768 systemd[1]: Started cri-containerd-d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b.scope - libcontainer container d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b. Mar 20 22:05:13.761396 containerd[1483]: time="2025-03-20T22:05:13.761328592Z" level=info msg="StartContainer for \"d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b\" returns successfully" Mar 20 22:05:15.073803 containerd[1483]: time="2025-03-20T22:05:15.073607829Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 20 22:05:15.080966 systemd[1]: cri-containerd-d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b.scope: Deactivated successfully. Mar 20 22:05:15.084520 containerd[1483]: time="2025-03-20T22:05:15.083062542Z" level=info msg="received exit event container_id:\"d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b\" id:\"d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b\" pid:3427 exited_at:{seconds:1742508315 nanos:82386454}" Mar 20 22:05:15.084520 containerd[1483]: time="2025-03-20T22:05:15.083539366Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b\" id:\"d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b\" pid:3427 exited_at:{seconds:1742508315 nanos:82386454}" Mar 20 22:05:15.081478 systemd[1]: cri-containerd-d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b.scope: Consumed 753ms CPU time, 174.9M memory peak, 154M written to disk. 
Mar 20 22:05:15.116784 kubelet[2706]: I0320 22:05:15.116709 2706 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Mar 20 22:05:15.137972 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d9bd2a31d3c830be8a4c65ee54fb944a3eeeedc7e7ea23250dc63dcbae1f0b2b-rootfs.mount: Deactivated successfully. Mar 20 22:05:15.436227 systemd[1]: Created slice kubepods-besteffort-podb6039ed2_4638_4c6e_9f5f_d86da6536f5a.slice - libcontainer container kubepods-besteffort-podb6039ed2_4638_4c6e_9f5f_d86da6536f5a.slice. Mar 20 22:05:15.457872 containerd[1483]: time="2025-03-20T22:05:15.457002347Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4r6v,Uid:b6039ed2-4638-4c6e-9f5f-d86da6536f5a,Namespace:calico-system,Attempt:0,}" Mar 20 22:05:15.495347 systemd[1]: Created slice kubepods-besteffort-pod40290dea_e62b_4abd_9ee1_590a63244daf.slice - libcontainer container kubepods-besteffort-pod40290dea_e62b_4abd_9ee1_590a63244daf.slice. Mar 20 22:05:15.509756 systemd[1]: Created slice kubepods-burstable-pod99039035_a126_40f6_8e23_9a343003a64b.slice - libcontainer container kubepods-burstable-pod99039035_a126_40f6_8e23_9a343003a64b.slice. Mar 20 22:05:15.517510 systemd[1]: Created slice kubepods-burstable-pod0907eb09_9d96_4133_9555_719a2d336ea7.slice - libcontainer container kubepods-burstable-pod0907eb09_9d96_4133_9555_719a2d336ea7.slice. Mar 20 22:05:15.527050 systemd[1]: Created slice kubepods-besteffort-pod5e77843c_1514_4098_aab0_d4016a6fe974.slice - libcontainer container kubepods-besteffort-pod5e77843c_1514_4098_aab0_d4016a6fe974.slice. 
Mar 20 22:05:15.531850 kubelet[2706]: I0320 22:05:15.531797 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0907eb09-9d96-4133-9555-719a2d336ea7-config-volume\") pod \"coredns-6f6b679f8f-2gj7w\" (UID: \"0907eb09-9d96-4133-9555-719a2d336ea7\") " pod="kube-system/coredns-6f6b679f8f-2gj7w" Mar 20 22:05:15.532708 kubelet[2706]: I0320 22:05:15.532213 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bf8b\" (UniqueName: \"kubernetes.io/projected/5e77843c-1514-4098-aab0-d4016a6fe974-kube-api-access-2bf8b\") pod \"calico-apiserver-84f7684999-sdktv\" (UID: \"5e77843c-1514-4098-aab0-d4016a6fe974\") " pod="calico-apiserver/calico-apiserver-84f7684999-sdktv" Mar 20 22:05:15.532708 kubelet[2706]: I0320 22:05:15.532309 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/95da2651-be83-4d81-b360-c868f8c2250b-calico-apiserver-certs\") pod \"calico-apiserver-84f7684999-262tz\" (UID: \"95da2651-be83-4d81-b360-c868f8c2250b\") " pod="calico-apiserver/calico-apiserver-84f7684999-262tz" Mar 20 22:05:15.532708 kubelet[2706]: I0320 22:05:15.532386 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40290dea-e62b-4abd-9ee1-590a63244daf-tigera-ca-bundle\") pod \"calico-kube-controllers-568f5b4f88-fx6tk\" (UID: \"40290dea-e62b-4abd-9ee1-590a63244daf\") " pod="calico-system/calico-kube-controllers-568f5b4f88-fx6tk" Mar 20 22:05:15.532708 kubelet[2706]: I0320 22:05:15.532519 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sbn6\" (UniqueName: \"kubernetes.io/projected/40290dea-e62b-4abd-9ee1-590a63244daf-kube-api-access-6sbn6\") pod 
\"calico-kube-controllers-568f5b4f88-fx6tk\" (UID: \"40290dea-e62b-4abd-9ee1-590a63244daf\") " pod="calico-system/calico-kube-controllers-568f5b4f88-fx6tk" Mar 20 22:05:15.533245 kubelet[2706]: I0320 22:05:15.532593 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z282g\" (UniqueName: \"kubernetes.io/projected/99039035-a126-40f6-8e23-9a343003a64b-kube-api-access-z282g\") pod \"coredns-6f6b679f8f-ctp7t\" (UID: \"99039035-a126-40f6-8e23-9a343003a64b\") " pod="kube-system/coredns-6f6b679f8f-ctp7t" Mar 20 22:05:15.533687 kubelet[2706]: I0320 22:05:15.533458 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5e77843c-1514-4098-aab0-d4016a6fe974-calico-apiserver-certs\") pod \"calico-apiserver-84f7684999-sdktv\" (UID: \"5e77843c-1514-4098-aab0-d4016a6fe974\") " pod="calico-apiserver/calico-apiserver-84f7684999-sdktv" Mar 20 22:05:15.533687 kubelet[2706]: I0320 22:05:15.533604 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcc9l\" (UniqueName: \"kubernetes.io/projected/95da2651-be83-4d81-b360-c868f8c2250b-kube-api-access-tcc9l\") pod \"calico-apiserver-84f7684999-262tz\" (UID: \"95da2651-be83-4d81-b360-c868f8c2250b\") " pod="calico-apiserver/calico-apiserver-84f7684999-262tz" Mar 20 22:05:15.535671 kubelet[2706]: I0320 22:05:15.533958 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm6n6\" (UniqueName: \"kubernetes.io/projected/0907eb09-9d96-4133-9555-719a2d336ea7-kube-api-access-vm6n6\") pod \"coredns-6f6b679f8f-2gj7w\" (UID: \"0907eb09-9d96-4133-9555-719a2d336ea7\") " pod="kube-system/coredns-6f6b679f8f-2gj7w" Mar 20 22:05:15.535671 kubelet[2706]: I0320 22:05:15.534837 2706 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99039035-a126-40f6-8e23-9a343003a64b-config-volume\") pod \"coredns-6f6b679f8f-ctp7t\" (UID: \"99039035-a126-40f6-8e23-9a343003a64b\") " pod="kube-system/coredns-6f6b679f8f-ctp7t" Mar 20 22:05:15.540867 systemd[1]: Created slice kubepods-besteffort-pod95da2651_be83_4d81_b360_c868f8c2250b.slice - libcontainer container kubepods-besteffort-pod95da2651_be83_4d81_b360_c868f8c2250b.slice. Mar 20 22:05:16.106664 containerd[1483]: time="2025-03-20T22:05:16.106474436Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568f5b4f88-fx6tk,Uid:40290dea-e62b-4abd-9ee1-590a63244daf,Namespace:calico-system,Attempt:0,}" Mar 20 22:05:16.119055 containerd[1483]: time="2025-03-20T22:05:16.118766561Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-ctp7t,Uid:99039035-a126-40f6-8e23-9a343003a64b,Namespace:kube-system,Attempt:0,}" Mar 20 22:05:16.127691 containerd[1483]: time="2025-03-20T22:05:16.125927001Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2gj7w,Uid:0907eb09-9d96-4133-9555-719a2d336ea7,Namespace:kube-system,Attempt:0,}" Mar 20 22:05:16.138052 containerd[1483]: time="2025-03-20T22:05:16.137973004Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f7684999-sdktv,Uid:5e77843c-1514-4098-aab0-d4016a6fe974,Namespace:calico-apiserver,Attempt:0,}" Mar 20 22:05:16.149770 containerd[1483]: time="2025-03-20T22:05:16.148466204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f7684999-262tz,Uid:95da2651-be83-4d81-b360-c868f8c2250b,Namespace:calico-apiserver,Attempt:0,}" Mar 20 22:05:16.393025 containerd[1483]: time="2025-03-20T22:05:16.392662597Z" level=error msg="Failed to destroy network for sandbox \"941defde4dec753600890046362f4c51a68486355a73afcb9c552db0f87d5179\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.396993 containerd[1483]: time="2025-03-20T22:05:16.396950389Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4r6v,Uid:b6039ed2-4638-4c6e-9f5f-d86da6536f5a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"941defde4dec753600890046362f4c51a68486355a73afcb9c552db0f87d5179\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.398423 kubelet[2706]: E0320 22:05:16.398365 2706 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"941defde4dec753600890046362f4c51a68486355a73afcb9c552db0f87d5179\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.398748 kubelet[2706]: E0320 22:05:16.398449 2706 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"941defde4dec753600890046362f4c51a68486355a73afcb9c552db0f87d5179\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-b4r6v" Mar 20 22:05:16.398748 kubelet[2706]: E0320 22:05:16.398474 2706 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"941defde4dec753600890046362f4c51a68486355a73afcb9c552db0f87d5179\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-b4r6v" Mar 20 22:05:16.398748 kubelet[2706]: E0320 22:05:16.398529 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-b4r6v_calico-system(b6039ed2-4638-4c6e-9f5f-d86da6536f5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-b4r6v_calico-system(b6039ed2-4638-4c6e-9f5f-d86da6536f5a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"941defde4dec753600890046362f4c51a68486355a73afcb9c552db0f87d5179\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-b4r6v" podUID="b6039ed2-4638-4c6e-9f5f-d86da6536f5a" Mar 20 22:05:16.399928 containerd[1483]: time="2025-03-20T22:05:16.399217080Z" level=error msg="Failed to destroy network for sandbox \"b837696413fa8f7f8eb6947ecee83a7c900bba50e1fcf08a60305d7641749910\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.402346 containerd[1483]: time="2025-03-20T22:05:16.402286056Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f7684999-sdktv,Uid:5e77843c-1514-4098-aab0-d4016a6fe974,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b837696413fa8f7f8eb6947ecee83a7c900bba50e1fcf08a60305d7641749910\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.403226 kubelet[2706]: E0320 22:05:16.402550 2706 log.go:32] "RunPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b837696413fa8f7f8eb6947ecee83a7c900bba50e1fcf08a60305d7641749910\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.403226 kubelet[2706]: E0320 22:05:16.402615 2706 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b837696413fa8f7f8eb6947ecee83a7c900bba50e1fcf08a60305d7641749910\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84f7684999-sdktv" Mar 20 22:05:16.403226 kubelet[2706]: E0320 22:05:16.402656 2706 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b837696413fa8f7f8eb6947ecee83a7c900bba50e1fcf08a60305d7641749910\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84f7684999-sdktv" Mar 20 22:05:16.403339 kubelet[2706]: E0320 22:05:16.402714 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-84f7684999-sdktv_calico-apiserver(5e77843c-1514-4098-aab0-d4016a6fe974)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-84f7684999-sdktv_calico-apiserver(5e77843c-1514-4098-aab0-d4016a6fe974)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b837696413fa8f7f8eb6947ecee83a7c900bba50e1fcf08a60305d7641749910\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-84f7684999-sdktv" podUID="5e77843c-1514-4098-aab0-d4016a6fe974" Mar 20 22:05:16.426540 containerd[1483]: time="2025-03-20T22:05:16.426497936Z" level=error msg="Failed to destroy network for sandbox \"5a0e13618b64988ddea43b16402d3125276b6d7481b620da67e88ff70a6ac45a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.428584 containerd[1483]: time="2025-03-20T22:05:16.428534466Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-ctp7t,Uid:99039035-a126-40f6-8e23-9a343003a64b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a0e13618b64988ddea43b16402d3125276b6d7481b620da67e88ff70a6ac45a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.429975 kubelet[2706]: E0320 22:05:16.428801 2706 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a0e13618b64988ddea43b16402d3125276b6d7481b620da67e88ff70a6ac45a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.429975 kubelet[2706]: E0320 22:05:16.428859 2706 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a0e13618b64988ddea43b16402d3125276b6d7481b620da67e88ff70a6ac45a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-ctp7t" Mar 20 22:05:16.429975 kubelet[2706]: E0320 22:05:16.428882 2706 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a0e13618b64988ddea43b16402d3125276b6d7481b620da67e88ff70a6ac45a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-ctp7t" Mar 20 22:05:16.430142 kubelet[2706]: E0320 22:05:16.428945 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-ctp7t_kube-system(99039035-a126-40f6-8e23-9a343003a64b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-ctp7t_kube-system(99039035-a126-40f6-8e23-9a343003a64b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5a0e13618b64988ddea43b16402d3125276b6d7481b620da67e88ff70a6ac45a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-ctp7t" podUID="99039035-a126-40f6-8e23-9a343003a64b" Mar 20 22:05:16.434761 containerd[1483]: time="2025-03-20T22:05:16.434717403Z" level=error msg="Failed to destroy network for sandbox \"2e782d9dd8407018b559fd06a79c2d2615d39c59d6e4effcf63e579f8c58bab8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.435558 containerd[1483]: time="2025-03-20T22:05:16.435475144Z" level=error msg="Failed to destroy network for sandbox \"aa7a9d281b048bced988a059e334da0d3d32a2ccc4914b24e32971e40fc765e5\"" error="plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.437389 containerd[1483]: time="2025-03-20T22:05:16.437337627Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2gj7w,Uid:0907eb09-9d96-4133-9555-719a2d336ea7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e782d9dd8407018b559fd06a79c2d2615d39c59d6e4effcf63e579f8c58bab8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.437863 kubelet[2706]: E0320 22:05:16.437788 2706 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e782d9dd8407018b559fd06a79c2d2615d39c59d6e4effcf63e579f8c58bab8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.437863 kubelet[2706]: E0320 22:05:16.437849 2706 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e782d9dd8407018b559fd06a79c2d2615d39c59d6e4effcf63e579f8c58bab8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-2gj7w" Mar 20 22:05:16.437981 kubelet[2706]: E0320 22:05:16.437872 2706 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e782d9dd8407018b559fd06a79c2d2615d39c59d6e4effcf63e579f8c58bab8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-2gj7w" Mar 20 22:05:16.437981 kubelet[2706]: E0320 22:05:16.437918 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-2gj7w_kube-system(0907eb09-9d96-4133-9555-719a2d336ea7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-2gj7w_kube-system(0907eb09-9d96-4133-9555-719a2d336ea7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2e782d9dd8407018b559fd06a79c2d2615d39c59d6e4effcf63e579f8c58bab8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-2gj7w" podUID="0907eb09-9d96-4133-9555-719a2d336ea7" Mar 20 22:05:16.440246 containerd[1483]: time="2025-03-20T22:05:16.439921443Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568f5b4f88-fx6tk,Uid:40290dea-e62b-4abd-9ee1-590a63244daf,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa7a9d281b048bced988a059e334da0d3d32a2ccc4914b24e32971e40fc765e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.440361 kubelet[2706]: E0320 22:05:16.440226 2706 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa7a9d281b048bced988a059e334da0d3d32a2ccc4914b24e32971e40fc765e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 
22:05:16.440361 kubelet[2706]: E0320 22:05:16.440279 2706 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa7a9d281b048bced988a059e334da0d3d32a2ccc4914b24e32971e40fc765e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568f5b4f88-fx6tk" Mar 20 22:05:16.440361 kubelet[2706]: E0320 22:05:16.440299 2706 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa7a9d281b048bced988a059e334da0d3d32a2ccc4914b24e32971e40fc765e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-568f5b4f88-fx6tk" Mar 20 22:05:16.440540 kubelet[2706]: E0320 22:05:16.440346 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-568f5b4f88-fx6tk_calico-system(40290dea-e62b-4abd-9ee1-590a63244daf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-568f5b4f88-fx6tk_calico-system(40290dea-e62b-4abd-9ee1-590a63244daf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aa7a9d281b048bced988a059e334da0d3d32a2ccc4914b24e32971e40fc765e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-568f5b4f88-fx6tk" podUID="40290dea-e62b-4abd-9ee1-590a63244daf" Mar 20 22:05:16.444149 containerd[1483]: time="2025-03-20T22:05:16.444082617Z" level=error msg="Failed to destroy network for sandbox 
\"0ec088c7ab155f283da232cff0e4d2f948539e66884d899ef79be249fb2f7886\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.446065 containerd[1483]: time="2025-03-20T22:05:16.446022045Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f7684999-262tz,Uid:95da2651-be83-4d81-b360-c868f8c2250b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec088c7ab155f283da232cff0e4d2f948539e66884d899ef79be249fb2f7886\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.446429 kubelet[2706]: E0320 22:05:16.446197 2706 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec088c7ab155f283da232cff0e4d2f948539e66884d899ef79be249fb2f7886\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 22:05:16.446429 kubelet[2706]: E0320 22:05:16.446243 2706 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec088c7ab155f283da232cff0e4d2f948539e66884d899ef79be249fb2f7886\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84f7684999-262tz" Mar 20 22:05:16.446429 kubelet[2706]: E0320 22:05:16.446264 2706 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"0ec088c7ab155f283da232cff0e4d2f948539e66884d899ef79be249fb2f7886\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-84f7684999-262tz" Mar 20 22:05:16.446697 kubelet[2706]: E0320 22:05:16.446306 2706 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-84f7684999-262tz_calico-apiserver(95da2651-be83-4d81-b360-c868f8c2250b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-84f7684999-262tz_calico-apiserver(95da2651-be83-4d81-b360-c868f8c2250b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0ec088c7ab155f283da232cff0e4d2f948539e66884d899ef79be249fb2f7886\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-84f7684999-262tz" podUID="95da2651-be83-4d81-b360-c868f8c2250b" Mar 20 22:05:16.588596 containerd[1483]: time="2025-03-20T22:05:16.588106903Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 20 22:05:17.141051 systemd[1]: run-netns-cni\x2df9e25d0f\x2d201f\x2df210\x2d3b01\x2dabe854b24dee.mount: Deactivated successfully. Mar 20 22:05:17.141307 systemd[1]: run-netns-cni\x2d39ffb23b\x2da1fe\x2d38b4\x2d948f\x2d134f37f3fca7.mount: Deactivated successfully. Mar 20 22:05:17.141489 systemd[1]: run-netns-cni\x2d30d53820\x2d5f11\x2d1370\x2d18da\x2dcd3e69e39584.mount: Deactivated successfully. Mar 20 22:05:17.141727 systemd[1]: run-netns-cni\x2d05f80289\x2d4e4a\x2d37b5\x2d625f\x2d1cb1e2c9a419.mount: Deactivated successfully. Mar 20 22:05:17.141903 systemd[1]: run-netns-cni\x2dd582bd7a\x2d4fe1\x2d8897\x2dfd12\x2d371fd2095c78.mount: Deactivated successfully. 
Mar 20 22:05:21.325842 kubelet[2706]: I0320 22:05:21.325037 2706 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 22:05:25.076793 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount150029225.mount: Deactivated successfully. Mar 20 22:05:25.118088 containerd[1483]: time="2025-03-20T22:05:25.118040267Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:05:25.120214 containerd[1483]: time="2025-03-20T22:05:25.120128777Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 20 22:05:25.122822 containerd[1483]: time="2025-03-20T22:05:25.122745053Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:05:25.126906 containerd[1483]: time="2025-03-20T22:05:25.126847032Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:05:25.128065 containerd[1483]: time="2025-03-20T22:05:25.127958839Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 8.53978416s" Mar 20 22:05:25.128065 containerd[1483]: time="2025-03-20T22:05:25.127989087Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 20 22:05:25.141075 containerd[1483]: time="2025-03-20T22:05:25.140986128Z" level=info 
msg="CreateContainer within sandbox \"bac8ff37e8b4250c36c84b627134ba70702e266797c9b982ec84ebcd939e99df\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 20 22:05:25.159681 containerd[1483]: time="2025-03-20T22:05:25.157772788Z" level=info msg="Container 059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:05:25.171673 containerd[1483]: time="2025-03-20T22:05:25.171617367Z" level=info msg="CreateContainer within sandbox \"bac8ff37e8b4250c36c84b627134ba70702e266797c9b982ec84ebcd939e99df\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb\"" Mar 20 22:05:25.172461 containerd[1483]: time="2025-03-20T22:05:25.172425011Z" level=info msg="StartContainer for \"059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb\"" Mar 20 22:05:25.174456 containerd[1483]: time="2025-03-20T22:05:25.174435524Z" level=info msg="connecting to shim 059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb" address="unix:///run/containerd/s/0f50ea2d63e850ac82c365b63399c61ef906eecef9a3e945fb45866be88c3210" protocol=ttrpc version=3 Mar 20 22:05:25.206771 systemd[1]: Started cri-containerd-059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb.scope - libcontainer container 059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb. Mar 20 22:05:25.254252 containerd[1483]: time="2025-03-20T22:05:25.254125791Z" level=info msg="StartContainer for \"059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb\" returns successfully" Mar 20 22:05:25.321442 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 20 22:05:25.321700 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Mar 20 22:05:27.019684 kernel: bpftool[3836]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 20 22:05:27.281734 systemd-networkd[1394]: vxlan.calico: Link UP Mar 20 22:05:27.281740 systemd-networkd[1394]: vxlan.calico: Gained carrier Mar 20 22:05:27.621491 kubelet[2706]: I0320 22:05:27.621371 2706 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 22:05:27.725209 containerd[1483]: time="2025-03-20T22:05:27.725124919Z" level=info msg="TaskExit event in podsandbox handler container_id:\"059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb\" id:\"be2fc9f5eec2b381b8a5a3c5403340deaf3af60843881fbd8c077e24b74fdc8a\" pid:3916 exit_status:1 exited_at:{seconds:1742508327 nanos:724416653}" Mar 20 22:05:27.805680 containerd[1483]: time="2025-03-20T22:05:27.805267561Z" level=info msg="TaskExit event in podsandbox handler container_id:\"059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb\" id:\"191aa69db0eccdea0cdd56cc0b738f98905bb5badd0611a13990f1efd570d3e1\" pid:3943 exit_status:1 exited_at:{seconds:1742508327 nanos:804805349}" Mar 20 22:05:28.319087 systemd-networkd[1394]: vxlan.calico: Gained IPv6LL Mar 20 22:05:28.424917 containerd[1483]: time="2025-03-20T22:05:28.424106747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2gj7w,Uid:0907eb09-9d96-4133-9555-719a2d336ea7,Namespace:kube-system,Attempt:0,}" Mar 20 22:05:28.655064 systemd-networkd[1394]: calif1fdbe110e8: Link UP Mar 20 22:05:28.657885 systemd-networkd[1394]: calif1fdbe110e8: Gained carrier Mar 20 22:05:28.701229 kubelet[2706]: I0320 22:05:28.701173 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-h5t7c" podStartSLOduration=4.841289616 podStartE2EDuration="28.697896775s" podCreationTimestamp="2025-03-20 22:05:00 +0000 UTC" firstStartedPulling="2025-03-20 22:05:01.27253294 +0000 UTC m=+12.968417346" lastFinishedPulling="2025-03-20 22:05:25.129140089 +0000 
UTC m=+36.825024505" observedRunningTime="2025-03-20 22:05:25.643609043 +0000 UTC m=+37.339493459" watchObservedRunningTime="2025-03-20 22:05:28.697896775 +0000 UTC m=+40.393781191" Mar 20 22:05:28.703207 containerd[1483]: 2025-03-20 22:05:28.550 [INFO][3955] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0 coredns-6f6b679f8f- kube-system 0907eb09-9d96-4133-9555-719a2d336ea7 705 0 2025-03-20 22:04:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-0-2-f-52bc1ad8d1.novalocal coredns-6f6b679f8f-2gj7w eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif1fdbe110e8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" Namespace="kube-system" Pod="coredns-6f6b679f8f-2gj7w" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-" Mar 20 22:05:28.703207 containerd[1483]: 2025-03-20 22:05:28.550 [INFO][3955] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" Namespace="kube-system" Pod="coredns-6f6b679f8f-2gj7w" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0" Mar 20 22:05:28.703207 containerd[1483]: 2025-03-20 22:05:28.591 [INFO][3967] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" HandleID="k8s-pod-network.e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0" Mar 20 22:05:28.703389 containerd[1483]: 2025-03-20 22:05:28.604 
[INFO][3967] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" HandleID="k8s-pod-network.e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001fc1a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-0-2-f-52bc1ad8d1.novalocal", "pod":"coredns-6f6b679f8f-2gj7w", "timestamp":"2025-03-20 22:05:28.591853946 +0000 UTC"}, Hostname:"ci-9999-0-2-f-52bc1ad8d1.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 22:05:28.703389 containerd[1483]: 2025-03-20 22:05:28.604 [INFO][3967] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 22:05:28.703389 containerd[1483]: 2025-03-20 22:05:28.604 [INFO][3967] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 22:05:28.703389 containerd[1483]: 2025-03-20 22:05:28.604 [INFO][3967] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-f-52bc1ad8d1.novalocal' Mar 20 22:05:28.703389 containerd[1483]: 2025-03-20 22:05:28.607 [INFO][3967] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" host="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:05:28.703389 containerd[1483]: 2025-03-20 22:05:28.612 [INFO][3967] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:05:28.703389 containerd[1483]: 2025-03-20 22:05:28.619 [INFO][3967] ipam/ipam.go 489: Trying affinity for 192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:05:28.703389 containerd[1483]: 2025-03-20 22:05:28.622 [INFO][3967] ipam/ipam.go 155: Attempting to load block cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:05:28.703389 containerd[1483]: 2025-03-20 22:05:28.624 [INFO][3967] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:05:28.705653 containerd[1483]: 2025-03-20 22:05:28.625 [INFO][3967] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.2.192/26 handle="k8s-pod-network.e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" host="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:05:28.705653 containerd[1483]: 2025-03-20 22:05:28.627 [INFO][3967] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34 Mar 20 22:05:28.705653 containerd[1483]: 2025-03-20 22:05:28.636 [INFO][3967] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.2.192/26 handle="k8s-pod-network.e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" host="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:05:28.705653 
containerd[1483]: 2025-03-20 22:05:28.643 [INFO][3967] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.2.193/26] block=192.168.2.192/26 handle="k8s-pod-network.e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" host="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:05:28.705653 containerd[1483]: 2025-03-20 22:05:28.643 [INFO][3967] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.2.193/26] handle="k8s-pod-network.e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" host="ci-9999-0-2-f-52bc1ad8d1.novalocal" Mar 20 22:05:28.705653 containerd[1483]: 2025-03-20 22:05:28.644 [INFO][3967] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 22:05:28.705653 containerd[1483]: 2025-03-20 22:05:28.644 [INFO][3967] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.2.193/26] IPv6=[] ContainerID="e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" HandleID="k8s-pod-network.e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0" Mar 20 22:05:28.705872 containerd[1483]: 2025-03-20 22:05:28.650 [INFO][3955] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" Namespace="kube-system" Pod="coredns-6f6b679f8f-2gj7w" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"0907eb09-9d96-4133-9555-719a2d336ea7", ResourceVersion:"705", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 4, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"", Pod:"coredns-6f6b679f8f-2gj7w", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.2.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif1fdbe110e8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 22:05:28.705872 containerd[1483]: 2025-03-20 22:05:28.650 [INFO][3955] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.2.193/32] ContainerID="e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" Namespace="kube-system" Pod="coredns-6f6b679f8f-2gj7w" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0" Mar 20 22:05:28.705872 containerd[1483]: 2025-03-20 22:05:28.650 [INFO][3955] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif1fdbe110e8 ContainerID="e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" Namespace="kube-system" Pod="coredns-6f6b679f8f-2gj7w" 
WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0" Mar 20 22:05:28.705872 containerd[1483]: 2025-03-20 22:05:28.658 [INFO][3955] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" Namespace="kube-system" Pod="coredns-6f6b679f8f-2gj7w" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0" Mar 20 22:05:28.705872 containerd[1483]: 2025-03-20 22:05:28.662 [INFO][3955] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" Namespace="kube-system" Pod="coredns-6f6b679f8f-2gj7w" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"0907eb09-9d96-4133-9555-719a2d336ea7", ResourceVersion:"705", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 4, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34", Pod:"coredns-6f6b679f8f-2gj7w", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.2.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif1fdbe110e8", MAC:"6a:85:ff:3b:5d:ff", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 22:05:28.705872 containerd[1483]: 2025-03-20 22:05:28.693 [INFO][3955] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" Namespace="kube-system" Pod="coredns-6f6b679f8f-2gj7w" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--2gj7w-eth0" Mar 20 22:05:28.783932 containerd[1483]: time="2025-03-20T22:05:28.783674019Z" level=info msg="connecting to shim e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34" address="unix:///run/containerd/s/9fd6121714d70672718ae9b3b3842062b3fcf4ecf4140807b5f3fb0c60c4d691" namespace=k8s.io protocol=ttrpc version=3 Mar 20 22:05:28.817781 systemd[1]: Started cri-containerd-e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34.scope - libcontainer container e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34. 
Mar 20 22:05:28.864370 containerd[1483]: time="2025-03-20T22:05:28.864320437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2gj7w,Uid:0907eb09-9d96-4133-9555-719a2d336ea7,Namespace:kube-system,Attempt:0,} returns sandbox id \"e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34\"" Mar 20 22:05:28.867959 containerd[1483]: time="2025-03-20T22:05:28.867479194Z" level=info msg="CreateContainer within sandbox \"e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 20 22:05:28.881536 containerd[1483]: time="2025-03-20T22:05:28.881478403Z" level=info msg="Container 3686c0f1bb7b7ee65b2c7ec93c1c0e2429ae38ecc24a01d7ec5a9301ab8558fa: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:05:28.886533 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4000198233.mount: Deactivated successfully. Mar 20 22:05:28.893242 containerd[1483]: time="2025-03-20T22:05:28.893209294Z" level=info msg="CreateContainer within sandbox \"e200c35a03bf3a2f57a817a3701df97475f4f051769de03071b5078e09c58d34\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3686c0f1bb7b7ee65b2c7ec93c1c0e2429ae38ecc24a01d7ec5a9301ab8558fa\"" Mar 20 22:05:28.894717 containerd[1483]: time="2025-03-20T22:05:28.894004263Z" level=info msg="StartContainer for \"3686c0f1bb7b7ee65b2c7ec93c1c0e2429ae38ecc24a01d7ec5a9301ab8558fa\"" Mar 20 22:05:28.895294 containerd[1483]: time="2025-03-20T22:05:28.895251044Z" level=info msg="connecting to shim 3686c0f1bb7b7ee65b2c7ec93c1c0e2429ae38ecc24a01d7ec5a9301ab8558fa" address="unix:///run/containerd/s/9fd6121714d70672718ae9b3b3842062b3fcf4ecf4140807b5f3fb0c60c4d691" protocol=ttrpc version=3 Mar 20 22:05:28.914760 systemd[1]: Started cri-containerd-3686c0f1bb7b7ee65b2c7ec93c1c0e2429ae38ecc24a01d7ec5a9301ab8558fa.scope - libcontainer container 3686c0f1bb7b7ee65b2c7ec93c1c0e2429ae38ecc24a01d7ec5a9301ab8558fa. 
Mar 20 22:05:28.949668 containerd[1483]: time="2025-03-20T22:05:28.949065915Z" level=info msg="StartContainer for \"3686c0f1bb7b7ee65b2c7ec93c1c0e2429ae38ecc24a01d7ec5a9301ab8558fa\" returns successfully"
Mar 20 22:05:29.422919 containerd[1483]: time="2025-03-20T22:05:29.422848260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f7684999-262tz,Uid:95da2651-be83-4d81-b360-c868f8c2250b,Namespace:calico-apiserver,Attempt:0,}"
Mar 20 22:05:29.423717 containerd[1483]: time="2025-03-20T22:05:29.423307105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-ctp7t,Uid:99039035-a126-40f6-8e23-9a343003a64b,Namespace:kube-system,Attempt:0,}"
Mar 20 22:05:29.649547 systemd-networkd[1394]: califaef00d3a45: Link UP
Mar 20 22:05:29.653117 systemd-networkd[1394]: califaef00d3a45: Gained carrier
Mar 20 22:05:29.753681 kubelet[2706]: I0320 22:05:29.751400 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-2gj7w" podStartSLOduration=36.751360297 podStartE2EDuration="36.751360297s" podCreationTimestamp="2025-03-20 22:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:05:29.70335224 +0000 UTC m=+41.399236646" watchObservedRunningTime="2025-03-20 22:05:29.751360297 +0000 UTC m=+41.447244703"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.524 [INFO][4067] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0 coredns-6f6b679f8f- kube-system 99039035-a126-40f6-8e23-9a343003a64b 711 0 2025-03-20 22:04:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-0-2-f-52bc1ad8d1.novalocal coredns-6f6b679f8f-ctp7t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califaef00d3a45 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" Namespace="kube-system" Pod="coredns-6f6b679f8f-ctp7t" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.524 [INFO][4067] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" Namespace="kube-system" Pod="coredns-6f6b679f8f-ctp7t" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.571 [INFO][4091] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" HandleID="k8s-pod-network.7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.582 [INFO][4091] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" HandleID="k8s-pod-network.7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031dc40), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-0-2-f-52bc1ad8d1.novalocal", "pod":"coredns-6f6b679f8f-ctp7t", "timestamp":"2025-03-20 22:05:29.57111978 +0000 UTC"}, Hostname:"ci-9999-0-2-f-52bc1ad8d1.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.582 [INFO][4091] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.582 [INFO][4091] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.582 [INFO][4091] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-f-52bc1ad8d1.novalocal'
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.585 [INFO][4091] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.589 [INFO][4091] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.595 [INFO][4091] ipam/ipam.go 489: Trying affinity for 192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.597 [INFO][4091] ipam/ipam.go 155: Attempting to load block cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.600 [INFO][4091] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.600 [INFO][4091] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.2.192/26 handle="k8s-pod-network.7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.602 [INFO][4091] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.614 [INFO][4091] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.2.192/26 handle="k8s-pod-network.7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.642 [INFO][4091] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.2.194/26] block=192.168.2.192/26 handle="k8s-pod-network.7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.643 [INFO][4091] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.2.194/26] handle="k8s-pod-network.7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.643 [INFO][4091] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 22:05:29.754423 containerd[1483]: 2025-03-20 22:05:29.643 [INFO][4091] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.2.194/26] IPv6=[] ContainerID="7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" HandleID="k8s-pod-network.7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0"
Mar 20 22:05:29.758952 containerd[1483]: 2025-03-20 22:05:29.645 [INFO][4067] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" Namespace="kube-system" Pod="coredns-6f6b679f8f-ctp7t" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"99039035-a126-40f6-8e23-9a343003a64b", ResourceVersion:"711", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 4, 53, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"", Pod:"coredns-6f6b679f8f-ctp7t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.2.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califaef00d3a45", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:05:29.758952 containerd[1483]: 2025-03-20 22:05:29.646 [INFO][4067] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.2.194/32] ContainerID="7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" Namespace="kube-system" Pod="coredns-6f6b679f8f-ctp7t" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0"
Mar 20 22:05:29.758952 containerd[1483]: 2025-03-20 22:05:29.646 [INFO][4067] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califaef00d3a45 ContainerID="7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" Namespace="kube-system" Pod="coredns-6f6b679f8f-ctp7t" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0"
Mar 20 22:05:29.758952 containerd[1483]: 2025-03-20 22:05:29.652 [INFO][4067] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" Namespace="kube-system" Pod="coredns-6f6b679f8f-ctp7t" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0"
Mar 20 22:05:29.758952 containerd[1483]: 2025-03-20 22:05:29.653 [INFO][4067] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" Namespace="kube-system" Pod="coredns-6f6b679f8f-ctp7t" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"99039035-a126-40f6-8e23-9a343003a64b", ResourceVersion:"711", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 4, 53, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097", Pod:"coredns-6f6b679f8f-ctp7t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.2.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califaef00d3a45", MAC:"92:fb:b8:4a:74:39", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:05:29.758952 containerd[1483]: 2025-03-20 22:05:29.747 [INFO][4067] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" Namespace="kube-system" Pod="coredns-6f6b679f8f-ctp7t" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-coredns--6f6b679f8f--ctp7t-eth0"
Mar 20 22:05:29.982892 systemd-networkd[1394]: calid8b724188e8: Link UP
Mar 20 22:05:29.983064 systemd-networkd[1394]: calid8b724188e8: Gained carrier
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.523 [INFO][4073] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0 calico-apiserver-84f7684999- calico-apiserver 95da2651-be83-4d81-b360-c868f8c2250b 714 0 2025-03-20 22:05:00 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:84f7684999 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-0-2-f-52bc1ad8d1.novalocal calico-apiserver-84f7684999-262tz eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid8b724188e8 [] []}} ContainerID="029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-262tz" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.525 [INFO][4073] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-262tz" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.565 [INFO][4093] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" HandleID="k8s-pod-network.029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.582 [INFO][4093] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" HandleID="k8s-pod-network.029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031d620), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-0-2-f-52bc1ad8d1.novalocal", "pod":"calico-apiserver-84f7684999-262tz", "timestamp":"2025-03-20 22:05:29.565486709 +0000 UTC"}, Hostname:"ci-9999-0-2-f-52bc1ad8d1.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.583 [INFO][4093] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.643 [INFO][4093] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.643 [INFO][4093] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-f-52bc1ad8d1.novalocal'
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.686 [INFO][4093] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.762 [INFO][4093] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.903 [INFO][4093] ipam/ipam.go 489: Trying affinity for 192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.919 [INFO][4093] ipam/ipam.go 155: Attempting to load block cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.925 [INFO][4093] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.925 [INFO][4093] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.2.192/26 handle="k8s-pod-network.029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.928 [INFO][4093] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.945 [INFO][4093] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.2.192/26 handle="k8s-pod-network.029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.974 [INFO][4093] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.2.195/26] block=192.168.2.192/26 handle="k8s-pod-network.029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.974 [INFO][4093] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.2.195/26] handle="k8s-pod-network.029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.974 [INFO][4093] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 22:05:30.137790 containerd[1483]: 2025-03-20 22:05:29.974 [INFO][4093] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.2.195/26] IPv6=[] ContainerID="029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" HandleID="k8s-pod-network.029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0"
Mar 20 22:05:30.144137 containerd[1483]: 2025-03-20 22:05:29.977 [INFO][4073] cni-plugin/k8s.go 386: Populated endpoint ContainerID="029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-262tz" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0", GenerateName:"calico-apiserver-84f7684999-", Namespace:"calico-apiserver", SelfLink:"", UID:"95da2651-be83-4d81-b360-c868f8c2250b", ResourceVersion:"714", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 5, 0, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84f7684999", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"", Pod:"calico-apiserver-84f7684999-262tz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.2.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid8b724188e8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:05:30.144137 containerd[1483]: 2025-03-20 22:05:29.977 [INFO][4073] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.2.195/32] ContainerID="029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-262tz" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0"
Mar 20 22:05:30.144137 containerd[1483]: 2025-03-20 22:05:29.979 [INFO][4073] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid8b724188e8 ContainerID="029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-262tz" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0"
Mar 20 22:05:30.144137 containerd[1483]: 2025-03-20 22:05:29.983 [INFO][4073] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-262tz" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0"
Mar 20 22:05:30.144137 containerd[1483]: 2025-03-20 22:05:29.985 [INFO][4073] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-262tz" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0", GenerateName:"calico-apiserver-84f7684999-", Namespace:"calico-apiserver", SelfLink:"", UID:"95da2651-be83-4d81-b360-c868f8c2250b", ResourceVersion:"714", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 5, 0, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84f7684999", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f", Pod:"calico-apiserver-84f7684999-262tz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.2.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid8b724188e8", MAC:"da:04:21:48:27:f3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:05:30.144137 containerd[1483]: 2025-03-20 22:05:30.127 [INFO][4073] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-262tz" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--262tz-eth0"
Mar 20 22:05:30.176036 systemd-networkd[1394]: calif1fdbe110e8: Gained IPv6LL
Mar 20 22:05:30.424312 containerd[1483]: time="2025-03-20T22:05:30.423110423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4r6v,Uid:b6039ed2-4638-4c6e-9f5f-d86da6536f5a,Namespace:calico-system,Attempt:0,}"
Mar 20 22:05:30.425918 containerd[1483]: time="2025-03-20T22:05:30.425852803Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f7684999-sdktv,Uid:5e77843c-1514-4098-aab0-d4016a6fe974,Namespace:calico-apiserver,Attempt:0,}"
Mar 20 22:05:30.829492 containerd[1483]: time="2025-03-20T22:05:30.828879435Z" level=info msg="connecting to shim 7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097" address="unix:///run/containerd/s/81c97d52cfac64c4022af9fcb596f261ecf752093c8b4c790b77870f36f5a68f" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:05:30.866284 containerd[1483]: time="2025-03-20T22:05:30.866219008Z" level=info msg="connecting to shim 029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f" address="unix:///run/containerd/s/7bf2d900f312aa5654f958ad227f76e276bf4d4dd96af0da75b704c099c2fc73" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:05:30.911843 systemd[1]: Started cri-containerd-7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097.scope - libcontainer container 7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097.
Mar 20 22:05:30.965811 systemd[1]: Started cri-containerd-029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f.scope - libcontainer container 029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f.
Mar 20 22:05:31.026505 containerd[1483]: time="2025-03-20T22:05:31.026442393Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-ctp7t,Uid:99039035-a126-40f6-8e23-9a343003a64b,Namespace:kube-system,Attempt:0,} returns sandbox id \"7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097\""
Mar 20 22:05:31.032685 containerd[1483]: time="2025-03-20T22:05:31.032503909Z" level=info msg="CreateContainer within sandbox \"7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Mar 20 22:05:31.053868 containerd[1483]: time="2025-03-20T22:05:31.053824967Z" level=info msg="Container 22c56ebe4eb7f14f730fed3dc1c8c7456ce96ac9c2aadab9dd85463aec781fb7: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:05:31.068830 systemd-networkd[1394]: cali3a2e8dd32f0: Link UP
Mar 20 22:05:31.071378 systemd-networkd[1394]: cali3a2e8dd32f0: Gained carrier
Mar 20 22:05:31.076380 containerd[1483]: time="2025-03-20T22:05:31.076324147Z" level=info msg="CreateContainer within sandbox \"7cd75995a6dff3786182ed134cdf076680224efdbc12546dd84ebe7ba5268097\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"22c56ebe4eb7f14f730fed3dc1c8c7456ce96ac9c2aadab9dd85463aec781fb7\""
Mar 20 22:05:31.077921 containerd[1483]: time="2025-03-20T22:05:31.077890099Z" level=info msg="StartContainer for \"22c56ebe4eb7f14f730fed3dc1c8c7456ce96ac9c2aadab9dd85463aec781fb7\""
Mar 20 22:05:31.082388 containerd[1483]: time="2025-03-20T22:05:31.082286365Z" level=info msg="connecting to shim 22c56ebe4eb7f14f730fed3dc1c8c7456ce96ac9c2aadab9dd85463aec781fb7" address="unix:///run/containerd/s/81c97d52cfac64c4022af9fcb596f261ecf752093c8b4c790b77870f36f5a68f" protocol=ttrpc version=3
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:30.869 [INFO][4137] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0 calico-apiserver-84f7684999- calico-apiserver 5e77843c-1514-4098-aab0-d4016a6fe974 713 0 2025-03-20 22:05:00 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:84f7684999 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-0-2-f-52bc1ad8d1.novalocal calico-apiserver-84f7684999-sdktv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3a2e8dd32f0 [] []}} ContainerID="631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-sdktv" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:30.869 [INFO][4137] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-sdktv" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:30.970 [INFO][4207] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" HandleID="k8s-pod-network.631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:30.984 [INFO][4207] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" HandleID="k8s-pod-network.631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000334c20), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-0-2-f-52bc1ad8d1.novalocal", "pod":"calico-apiserver-84f7684999-sdktv", "timestamp":"2025-03-20 22:05:30.97076155 +0000 UTC"}, Hostname:"ci-9999-0-2-f-52bc1ad8d1.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:30.984 [INFO][4207] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:30.984 [INFO][4207] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:30.984 [INFO][4207] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-f-52bc1ad8d1.novalocal'
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:30.990 [INFO][4207] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:30.996 [INFO][4207] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:31.012 [INFO][4207] ipam/ipam.go 489: Trying affinity for 192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:31.018 [INFO][4207] ipam/ipam.go 155: Attempting to load block cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:31.027 [INFO][4207] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:31.027 [INFO][4207] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.2.192/26 handle="k8s-pod-network.631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:31.030 [INFO][4207] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:31.043 [INFO][4207] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.2.192/26 handle="k8s-pod-network.631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:31.057 [INFO][4207] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.2.196/26] block=192.168.2.192/26 handle="k8s-pod-network.631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:31.057 [INFO][4207] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.2.196/26] handle="k8s-pod-network.631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:31.057 [INFO][4207] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 22:05:31.107050 containerd[1483]: 2025-03-20 22:05:31.057 [INFO][4207] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.2.196/26] IPv6=[] ContainerID="631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" HandleID="k8s-pod-network.631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0"
Mar 20 22:05:31.107907 containerd[1483]: 2025-03-20 22:05:31.061 [INFO][4137] cni-plugin/k8s.go 386: Populated endpoint ContainerID="631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-sdktv" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0", GenerateName:"calico-apiserver-84f7684999-", Namespace:"calico-apiserver", SelfLink:"", UID:"5e77843c-1514-4098-aab0-d4016a6fe974", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 5, 0, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84f7684999", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"", Pod:"calico-apiserver-84f7684999-sdktv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.2.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3a2e8dd32f0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:05:31.107907 containerd[1483]: 2025-03-20 22:05:31.062 [INFO][4137] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.2.196/32] ContainerID="631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-sdktv" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0"
Mar 20 22:05:31.107907 containerd[1483]: 2025-03-20 22:05:31.062 [INFO][4137] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3a2e8dd32f0 ContainerID="631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-sdktv" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0"
Mar 20 22:05:31.107907 containerd[1483]: 2025-03-20 22:05:31.071 [INFO][4137] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-sdktv" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0"
Mar 20 22:05:31.107907 containerd[1483]: 2025-03-20 22:05:31.073 [INFO][4137] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-sdktv" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0", GenerateName:"calico-apiserver-84f7684999-", Namespace:"calico-apiserver", SelfLink:"", UID:"5e77843c-1514-4098-aab0-d4016a6fe974", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 5, 0, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"84f7684999", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8", Pod:"calico-apiserver-84f7684999-sdktv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.2.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3a2e8dd32f0", MAC:"9a:bc:a7:00:73:11", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:05:31.107907 containerd[1483]: 2025-03-20 22:05:31.104 [INFO][4137] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" Namespace="calico-apiserver" Pod="calico-apiserver-84f7684999-sdktv" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--apiserver--84f7684999--sdktv-eth0"
Mar 20 22:05:31.134835 systemd-networkd[1394]: calid8b724188e8: Gained IPv6LL
Mar 20 22:05:31.141866 systemd[1]: Started cri-containerd-22c56ebe4eb7f14f730fed3dc1c8c7456ce96ac9c2aadab9dd85463aec781fb7.scope - libcontainer container 22c56ebe4eb7f14f730fed3dc1c8c7456ce96ac9c2aadab9dd85463aec781fb7.
Mar 20 22:05:31.173220 containerd[1483]: time="2025-03-20T22:05:31.171151490Z" level=info msg="connecting to shim 631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8" address="unix:///run/containerd/s/62a59550b5a698886a0933dc2539d78ba02d65dac4846b0846bb54c8d9887809" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:05:31.197546 systemd-networkd[1394]: cali7b1afe24140: Link UP
Mar 20 22:05:31.200439 systemd-networkd[1394]: cali7b1afe24140: Gained carrier
Mar 20 22:05:31.245775 systemd[1]: Started cri-containerd-631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8.scope - libcontainer container 631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8.
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:30.881 [INFO][4146] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0 csi-node-driver- calico-system b6039ed2-4638-4c6e-9f5f-d86da6536f5a 616 0 2025-03-20 22:05:00 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:568c96974f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-9999-0-2-f-52bc1ad8d1.novalocal csi-node-driver-b4r6v eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali7b1afe24140 [] []}} ContainerID="b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" Namespace="calico-system" Pod="csi-node-driver-b4r6v" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:30.881 [INFO][4146] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" Namespace="calico-system" Pod="csi-node-driver-b4r6v" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:30.974 [INFO][4216] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" HandleID="k8s-pod-network.b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:30.997 [INFO][4216] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" HandleID="k8s-pod-network.b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051b80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-2-f-52bc1ad8d1.novalocal", "pod":"csi-node-driver-b4r6v", "timestamp":"2025-03-20 22:05:30.974759577 +0000 UTC"}, Hostname:"ci-9999-0-2-f-52bc1ad8d1.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:30.997 [INFO][4216] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.057 [INFO][4216] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.057 [INFO][4216] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-f-52bc1ad8d1.novalocal'
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.093 [INFO][4216] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.118 [INFO][4216] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.129 [INFO][4216] ipam/ipam.go 489: Trying affinity for 192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.138 [INFO][4216] ipam/ipam.go 155: Attempting to load block cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.146 [INFO][4216] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.146 [INFO][4216] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.2.192/26 handle="k8s-pod-network.b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.151 [INFO][4216] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.161 [INFO][4216] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.2.192/26 handle="k8s-pod-network.b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.176 [INFO][4216] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.2.197/26] block=192.168.2.192/26 handle="k8s-pod-network.b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.177 [INFO][4216] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.2.197/26] handle="k8s-pod-network.b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.177 [INFO][4216] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 22:05:31.251533 containerd[1483]: 2025-03-20 22:05:31.177 [INFO][4216] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.2.197/26] IPv6=[] ContainerID="b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" HandleID="k8s-pod-network.b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0"
Mar 20 22:05:31.252488 containerd[1483]: 2025-03-20 22:05:31.186 [INFO][4146] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" Namespace="calico-system" Pod="csi-node-driver-b4r6v" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b6039ed2-4638-4c6e-9f5f-d86da6536f5a", ResourceVersion:"616", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 5, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"", Pod:"csi-node-driver-b4r6v", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.2.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7b1afe24140", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:05:31.252488 containerd[1483]: 2025-03-20 22:05:31.186 [INFO][4146] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.2.197/32] ContainerID="b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" Namespace="calico-system" Pod="csi-node-driver-b4r6v" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0"
Mar 20 22:05:31.252488 containerd[1483]: 2025-03-20 22:05:31.186 [INFO][4146] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7b1afe24140 ContainerID="b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" Namespace="calico-system" Pod="csi-node-driver-b4r6v" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0"
Mar 20 22:05:31.252488 containerd[1483]: 2025-03-20 22:05:31.211 [INFO][4146] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" Namespace="calico-system" Pod="csi-node-driver-b4r6v" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0"
Mar 20 22:05:31.252488 containerd[1483]: 2025-03-20 22:05:31.214 [INFO][4146] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" Namespace="calico-system" Pod="csi-node-driver-b4r6v" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b6039ed2-4638-4c6e-9f5f-d86da6536f5a", ResourceVersion:"616", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 5, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6", Pod:"csi-node-driver-b4r6v", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.2.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7b1afe24140", MAC:"da:fd:c8:10:ce:9d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:05:31.252488 containerd[1483]: 2025-03-20 22:05:31.241 [INFO][4146] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" Namespace="calico-system" Pod="csi-node-driver-b4r6v" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-csi--node--driver--b4r6v-eth0"
Mar 20 22:05:31.268738 containerd[1483]: time="2025-03-20T22:05:31.268119428Z" level=info msg="StartContainer for \"22c56ebe4eb7f14f730fed3dc1c8c7456ce96ac9c2aadab9dd85463aec781fb7\" returns successfully"
Mar 20 22:05:31.286218 containerd[1483]: time="2025-03-20T22:05:31.286171223Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f7684999-262tz,Uid:95da2651-be83-4d81-b360-c868f8c2250b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f\""
Mar 20 22:05:31.288456 containerd[1483]: time="2025-03-20T22:05:31.288428819Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 20 22:05:31.308128 containerd[1483]: time="2025-03-20T22:05:31.307097558Z" level=info msg="connecting to shim b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6" address="unix:///run/containerd/s/f272f0e075121018981c7b98eaf66210eca3b3b744a107d3d0c94ab912212ece" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:05:31.346271 systemd[1]: Started cri-containerd-b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6.scope - libcontainer container b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6.
Mar 20 22:05:31.413348 containerd[1483]: time="2025-03-20T22:05:31.413304031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b4r6v,Uid:b6039ed2-4638-4c6e-9f5f-d86da6536f5a,Namespace:calico-system,Attempt:0,} returns sandbox id \"b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6\""
Mar 20 22:05:31.422754 containerd[1483]: time="2025-03-20T22:05:31.422465308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568f5b4f88-fx6tk,Uid:40290dea-e62b-4abd-9ee1-590a63244daf,Namespace:calico-system,Attempt:0,}"
Mar 20 22:05:31.429317 containerd[1483]: time="2025-03-20T22:05:31.429269495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-84f7684999-sdktv,Uid:5e77843c-1514-4098-aab0-d4016a6fe974,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8\""
Mar 20 22:05:31.562526 systemd-networkd[1394]: cali9c93cac73de: Link UP
Mar 20 22:05:31.563012 systemd-networkd[1394]: cali9c93cac73de: Gained carrier
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.477 [INFO][4411] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0 calico-kube-controllers-568f5b4f88- calico-system 40290dea-e62b-4abd-9ee1-590a63244daf 702 0 2025-03-20 22:05:01 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:568f5b4f88 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-9999-0-2-f-52bc1ad8d1.novalocal calico-kube-controllers-568f5b4f88-fx6tk eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali9c93cac73de [] []}} ContainerID="523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" Namespace="calico-system" Pod="calico-kube-controllers-568f5b4f88-fx6tk" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.477 [INFO][4411] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" Namespace="calico-system" Pod="calico-kube-controllers-568f5b4f88-fx6tk" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.514 [INFO][4424] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" HandleID="k8s-pod-network.523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.524 [INFO][4424] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" HandleID="k8s-pod-network.523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003bc160), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-2-f-52bc1ad8d1.novalocal", "pod":"calico-kube-controllers-568f5b4f88-fx6tk", "timestamp":"2025-03-20 22:05:31.51453821 +0000 UTC"}, Hostname:"ci-9999-0-2-f-52bc1ad8d1.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.524 [INFO][4424] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.524 [INFO][4424] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.524 [INFO][4424] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-2-f-52bc1ad8d1.novalocal'
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.526 [INFO][4424] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.530 [INFO][4424] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.535 [INFO][4424] ipam/ipam.go 489: Trying affinity for 192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.537 [INFO][4424] ipam/ipam.go 155: Attempting to load block cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.540 [INFO][4424] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.2.192/26 host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.540 [INFO][4424] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.2.192/26 handle="k8s-pod-network.523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.542 [INFO][4424] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.550 [INFO][4424] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.2.192/26 handle="k8s-pod-network.523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.558 [INFO][4424] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.2.198/26] block=192.168.2.192/26 handle="k8s-pod-network.523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.558 [INFO][4424] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.2.198/26] handle="k8s-pod-network.523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" host="ci-9999-0-2-f-52bc1ad8d1.novalocal"
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.558 [INFO][4424] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 22:05:31.577947 containerd[1483]: 2025-03-20 22:05:31.558 [INFO][4424] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.2.198/26] IPv6=[] ContainerID="523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" HandleID="k8s-pod-network.523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" Workload="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0"
Mar 20 22:05:31.578566 containerd[1483]: 2025-03-20 22:05:31.560 [INFO][4411] cni-plugin/k8s.go 386: Populated endpoint ContainerID="523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" Namespace="calico-system" Pod="calico-kube-controllers-568f5b4f88-fx6tk" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0", GenerateName:"calico-kube-controllers-568f5b4f88-", Namespace:"calico-system", SelfLink:"", UID:"40290dea-e62b-4abd-9ee1-590a63244daf", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 5, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"568f5b4f88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"", Pod:"calico-kube-controllers-568f5b4f88-fx6tk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.2.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9c93cac73de", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:05:31.578566 containerd[1483]: 2025-03-20 22:05:31.560 [INFO][4411] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.2.198/32] ContainerID="523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" Namespace="calico-system" Pod="calico-kube-controllers-568f5b4f88-fx6tk" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0"
Mar 20 22:05:31.578566 containerd[1483]: 2025-03-20 22:05:31.560 [INFO][4411] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9c93cac73de ContainerID="523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" Namespace="calico-system" Pod="calico-kube-controllers-568f5b4f88-fx6tk" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0"
Mar 20 22:05:31.578566 containerd[1483]: 2025-03-20 22:05:31.563 [INFO][4411] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" Namespace="calico-system" Pod="calico-kube-controllers-568f5b4f88-fx6tk" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0"
Mar 20 22:05:31.578566 containerd[1483]: 2025-03-20 22:05:31.563 [INFO][4411] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" Namespace="calico-system" Pod="calico-kube-controllers-568f5b4f88-fx6tk" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0", GenerateName:"calico-kube-controllers-568f5b4f88-", Namespace:"calico-system", SelfLink:"", UID:"40290dea-e62b-4abd-9ee1-590a63244daf", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 22, 5, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"568f5b4f88", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-2-f-52bc1ad8d1.novalocal", ContainerID:"523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393", Pod:"calico-kube-controllers-568f5b4f88-fx6tk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.2.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9c93cac73de", MAC:"4a:f5:f3:da:06:67", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 22:05:31.578566 containerd[1483]: 2025-03-20 22:05:31.575 [INFO][4411] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" Namespace="calico-system" Pod="calico-kube-controllers-568f5b4f88-fx6tk" WorkloadEndpoint="ci--9999--0--2--f--52bc1ad8d1.novalocal-k8s-calico--kube--controllers--568f5b4f88--fx6tk-eth0"
Mar 20 22:05:31.583688 systemd-networkd[1394]: califaef00d3a45: Gained IPv6LL
Mar 20 22:05:31.622373 containerd[1483]: time="2025-03-20T22:05:31.620886479Z" level=info msg="connecting to shim 523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393" address="unix:///run/containerd/s/f8d0bc3d9c1448440928de57c17b692ae2f51e3af6612af8594301f865d747d4" namespace=k8s.io protocol=ttrpc version=3
Mar 20 22:05:31.663525 systemd[1]: Started cri-containerd-523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393.scope - libcontainer container 523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393.
Mar 20 22:05:31.674271 kubelet[2706]: I0320 22:05:31.674202 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-ctp7t" podStartSLOduration=38.674182469 podStartE2EDuration="38.674182469s" podCreationTimestamp="2025-03-20 22:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 22:05:31.673247246 +0000 UTC m=+43.369131662" watchObservedRunningTime="2025-03-20 22:05:31.674182469 +0000 UTC m=+43.370066875"
Mar 20 22:05:31.789841 containerd[1483]: time="2025-03-20T22:05:31.789744595Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-568f5b4f88-fx6tk,Uid:40290dea-e62b-4abd-9ee1-590a63244daf,Namespace:calico-system,Attempt:0,} returns sandbox id \"523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393\""
Mar 20 22:05:32.414995 systemd-networkd[1394]: cali3a2e8dd32f0: Gained IPv6LL
Mar 20 22:05:32.478819 systemd-networkd[1394]: cali7b1afe24140: Gained IPv6LL
Mar 20 22:05:33.118865 systemd-networkd[1394]: cali9c93cac73de: Gained IPv6LL
Mar 20 22:05:35.671324 containerd[1483]: time="2025-03-20T22:05:35.671269130Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:35.672815 containerd[1483]: time="2025-03-20T22:05:35.672757534Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204"
Mar 20 22:05:35.674299 containerd[1483]: time="2025-03-20T22:05:35.674156350Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:35.677304 containerd[1483]: time="2025-03-20T22:05:35.677191988Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:35.678214 containerd[1483]: time="2025-03-20T22:05:35.677730944Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 4.389257821s"
Mar 20 22:05:35.678214 containerd[1483]: time="2025-03-20T22:05:35.677772341Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 20 22:05:35.679686 containerd[1483]: time="2025-03-20T22:05:35.679464209Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\""
Mar 20 22:05:35.681213 containerd[1483]: time="2025-03-20T22:05:35.681157860Z" level=info msg="CreateContainer within sandbox \"029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 20 22:05:35.698081 containerd[1483]: time="2025-03-20T22:05:35.695765871Z" level=info msg="Container 75f7c6a589b05f22548bcf94e7ad5fad00e6f39f3905e7422554df8afebb375f: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:05:35.710116 containerd[1483]: time="2025-03-20T22:05:35.710082583Z" level=info msg="CreateContainer within sandbox \"029713bc50d19052c1444fe1c840a4621a59396ec981eff40b171038b76f4e1f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"75f7c6a589b05f22548bcf94e7ad5fad00e6f39f3905e7422554df8afebb375f\""
Mar 20 22:05:35.711439 containerd[1483]: time="2025-03-20T22:05:35.711418941Z" level=info msg="StartContainer for \"75f7c6a589b05f22548bcf94e7ad5fad00e6f39f3905e7422554df8afebb375f\""
Mar 20 22:05:35.713054 containerd[1483]: time="2025-03-20T22:05:35.713031288Z" level=info msg="connecting to shim 75f7c6a589b05f22548bcf94e7ad5fad00e6f39f3905e7422554df8afebb375f" address="unix:///run/containerd/s/7bf2d900f312aa5654f958ad227f76e276bf4d4dd96af0da75b704c099c2fc73" protocol=ttrpc version=3
Mar 20 22:05:35.752775 systemd[1]: Started cri-containerd-75f7c6a589b05f22548bcf94e7ad5fad00e6f39f3905e7422554df8afebb375f.scope - libcontainer container 75f7c6a589b05f22548bcf94e7ad5fad00e6f39f3905e7422554df8afebb375f.
Mar 20 22:05:35.816162 containerd[1483]: time="2025-03-20T22:05:35.816097511Z" level=info msg="StartContainer for \"75f7c6a589b05f22548bcf94e7ad5fad00e6f39f3905e7422554df8afebb375f\" returns successfully"
Mar 20 22:05:36.678563 kubelet[2706]: I0320 22:05:36.678463 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-84f7684999-262tz" podStartSLOduration=32.287044863 podStartE2EDuration="36.678431825s" podCreationTimestamp="2025-03-20 22:05:00 +0000 UTC" firstStartedPulling="2025-03-20 22:05:31.287885125 +0000 UTC m=+42.983769541" lastFinishedPulling="2025-03-20 22:05:35.679272087 +0000 UTC m=+47.375156503" observedRunningTime="2025-03-20 22:05:36.678176535 +0000 UTC m=+48.374061021" watchObservedRunningTime="2025-03-20 22:05:36.678431825 +0000 UTC m=+48.374316282"
Mar 20 22:05:37.800394 containerd[1483]: time="2025-03-20T22:05:37.800344352Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:37.801705 containerd[1483]: time="2025-03-20T22:05:37.801636627Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887"
Mar 20 22:05:37.802941 containerd[1483]: time="2025-03-20T22:05:37.802895688Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:37.805228 containerd[1483]: time="2025-03-20T22:05:37.805183487Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:37.806042 containerd[1483]: time="2025-03-20T22:05:37.805838630Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 2.126345206s"
Mar 20 22:05:37.806042 containerd[1483]: time="2025-03-20T22:05:37.805879428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\""
Mar 20 22:05:37.807222 containerd[1483]: time="2025-03-20T22:05:37.807040985Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 20 22:05:37.808274 containerd[1483]: time="2025-03-20T22:05:37.808248731Z" level=info msg="CreateContainer within sandbox \"b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 20 22:05:37.822194 containerd[1483]: time="2025-03-20T22:05:37.820896833Z" level=info msg="Container c96a08236c7f0034563b3c9e127ccda2eacc371c303d7ce078eea46372f02876: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:05:37.844437 containerd[1483]: time="2025-03-20T22:05:37.844402536Z" level=info msg="CreateContainer within sandbox \"b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id
\"c96a08236c7f0034563b3c9e127ccda2eacc371c303d7ce078eea46372f02876\"" Mar 20 22:05:37.845655 containerd[1483]: time="2025-03-20T22:05:37.845600253Z" level=info msg="StartContainer for \"c96a08236c7f0034563b3c9e127ccda2eacc371c303d7ce078eea46372f02876\"" Mar 20 22:05:37.847721 containerd[1483]: time="2025-03-20T22:05:37.847654161Z" level=info msg="connecting to shim c96a08236c7f0034563b3c9e127ccda2eacc371c303d7ce078eea46372f02876" address="unix:///run/containerd/s/f272f0e075121018981c7b98eaf66210eca3b3b744a107d3d0c94ab912212ece" protocol=ttrpc version=3 Mar 20 22:05:37.873777 systemd[1]: Started cri-containerd-c96a08236c7f0034563b3c9e127ccda2eacc371c303d7ce078eea46372f02876.scope - libcontainer container c96a08236c7f0034563b3c9e127ccda2eacc371c303d7ce078eea46372f02876. Mar 20 22:05:37.918223 containerd[1483]: time="2025-03-20T22:05:37.918180890Z" level=info msg="StartContainer for \"c96a08236c7f0034563b3c9e127ccda2eacc371c303d7ce078eea46372f02876\" returns successfully" Mar 20 22:05:38.314762 containerd[1483]: time="2025-03-20T22:05:38.314581112Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 22:05:38.316175 containerd[1483]: time="2025-03-20T22:05:38.316024170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 20 22:05:38.327887 containerd[1483]: time="2025-03-20T22:05:38.327811359Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 520.731129ms" Mar 20 22:05:38.328270 containerd[1483]: time="2025-03-20T22:05:38.328181706Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 20 22:05:38.331614 containerd[1483]: time="2025-03-20T22:05:38.331558727Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 20 22:05:38.334448 containerd[1483]: time="2025-03-20T22:05:38.334350725Z" level=info msg="CreateContainer within sandbox \"631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 20 22:05:38.353267 containerd[1483]: time="2025-03-20T22:05:38.353189103Z" level=info msg="Container 3f61a16e5743ebaec5dae1a01237fc2f5523abc993f17c2d3d2414ff07e824a9: CDI devices from CRI Config.CDIDevices: []" Mar 20 22:05:38.381728 containerd[1483]: time="2025-03-20T22:05:38.381570524Z" level=info msg="CreateContainer within sandbox \"631f0fbc65326531a768e2666f34b660fa46d17a6fbf777735c34f48e677afb8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"3f61a16e5743ebaec5dae1a01237fc2f5523abc993f17c2d3d2414ff07e824a9\"" Mar 20 22:05:38.384863 containerd[1483]: time="2025-03-20T22:05:38.383093713Z" level=info msg="StartContainer for \"3f61a16e5743ebaec5dae1a01237fc2f5523abc993f17c2d3d2414ff07e824a9\"" Mar 20 22:05:38.386521 containerd[1483]: time="2025-03-20T22:05:38.386415188Z" level=info msg="connecting to shim 3f61a16e5743ebaec5dae1a01237fc2f5523abc993f17c2d3d2414ff07e824a9" address="unix:///run/containerd/s/62a59550b5a698886a0933dc2539d78ba02d65dac4846b0846bb54c8d9887809" protocol=ttrpc version=3 Mar 20 22:05:38.425803 systemd[1]: Started cri-containerd-3f61a16e5743ebaec5dae1a01237fc2f5523abc993f17c2d3d2414ff07e824a9.scope - libcontainer container 3f61a16e5743ebaec5dae1a01237fc2f5523abc993f17c2d3d2414ff07e824a9. 
Mar 20 22:05:38.496022 containerd[1483]: time="2025-03-20T22:05:38.495990843Z" level=info msg="StartContainer for \"3f61a16e5743ebaec5dae1a01237fc2f5523abc993f17c2d3d2414ff07e824a9\" returns successfully"
Mar 20 22:05:38.689111 kubelet[2706]: I0320 22:05:38.688792 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-84f7684999-sdktv" podStartSLOduration=31.790172141 podStartE2EDuration="38.688775026s" podCreationTimestamp="2025-03-20 22:05:00 +0000 UTC" firstStartedPulling="2025-03-20 22:05:31.431756612 +0000 UTC m=+43.127641028" lastFinishedPulling="2025-03-20 22:05:38.330359457 +0000 UTC m=+50.026243913" observedRunningTime="2025-03-20 22:05:38.68861751 +0000 UTC m=+50.384501926" watchObservedRunningTime="2025-03-20 22:05:38.688775026 +0000 UTC m=+50.384659442"
Mar 20 22:05:39.674217 kubelet[2706]: I0320 22:05:39.674169 2706 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 22:05:41.604959 containerd[1483]: time="2025-03-20T22:05:41.604881104Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:41.606527 containerd[1483]: time="2025-03-20T22:05:41.606312499Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912"
Mar 20 22:05:41.607930 containerd[1483]: time="2025-03-20T22:05:41.607865002Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:41.610462 containerd[1483]: time="2025-03-20T22:05:41.610394334Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:41.611399 containerd[1483]: time="2025-03-20T22:05:41.611008040Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 3.279046965s"
Mar 20 22:05:41.611399 containerd[1483]: time="2025-03-20T22:05:41.611038877Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\""
Mar 20 22:05:41.612739 containerd[1483]: time="2025-03-20T22:05:41.612445866Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\""
Mar 20 22:05:41.625602 containerd[1483]: time="2025-03-20T22:05:41.625552533Z" level=info msg="CreateContainer within sandbox \"523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 20 22:05:41.635969 containerd[1483]: time="2025-03-20T22:05:41.635800226Z" level=info msg="Container 934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:05:41.651835 containerd[1483]: time="2025-03-20T22:05:41.651790291Z" level=info msg="CreateContainer within sandbox \"523a554a834af2e8d7e5992b7206db70f60133a66b6b4df7a8457184e116e393\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\""
Mar 20 22:05:41.652668 containerd[1483]: time="2025-03-20T22:05:41.652551985Z" level=info msg="StartContainer for \"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\""
Mar 20 22:05:41.654198 containerd[1483]: time="2025-03-20T22:05:41.654078979Z" level=info msg="connecting to shim 934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99" address="unix:///run/containerd/s/f8d0bc3d9c1448440928de57c17b692ae2f51e3af6612af8594301f865d747d4" protocol=ttrpc version=3
Mar 20 22:05:41.683800 systemd[1]: Started cri-containerd-934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99.scope - libcontainer container 934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99.
Mar 20 22:05:41.737438 containerd[1483]: time="2025-03-20T22:05:41.737393307Z" level=info msg="StartContainer for \"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\" returns successfully"
Mar 20 22:05:42.721886 kubelet[2706]: I0320 22:05:42.721699 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-568f5b4f88-fx6tk" podStartSLOduration=31.900865125 podStartE2EDuration="41.721612205s" podCreationTimestamp="2025-03-20 22:05:01 +0000 UTC" firstStartedPulling="2025-03-20 22:05:31.791087756 +0000 UTC m=+43.486972172" lastFinishedPulling="2025-03-20 22:05:41.611834846 +0000 UTC m=+53.307719252" observedRunningTime="2025-03-20 22:05:42.717999804 +0000 UTC m=+54.413884290" watchObservedRunningTime="2025-03-20 22:05:42.721612205 +0000 UTC m=+54.417496661"
Mar 20 22:05:42.810878 containerd[1483]: time="2025-03-20T22:05:42.810794063Z" level=info msg="TaskExit event in podsandbox handler container_id:\"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\" id:\"4c8c2082cda6d72c3a48720da2ff0d9764d31fb0d318ca50ba4f89a2a3b99fbe\" pid:4674 exited_at:{seconds:1742508342 nanos:809797547}"
Mar 20 22:05:44.017456 containerd[1483]: time="2025-03-20T22:05:44.017415667Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:44.018779 containerd[1483]: time="2025-03-20T22:05:44.018728658Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843"
Mar 20 22:05:44.020170 containerd[1483]: time="2025-03-20T22:05:44.020123343Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:44.022677 containerd[1483]: time="2025-03-20T22:05:44.022617909Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 22:05:44.023610 containerd[1483]: time="2025-03-20T22:05:44.023486003Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.411001544s"
Mar 20 22:05:44.023610 containerd[1483]: time="2025-03-20T22:05:44.023537740Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\""
Mar 20 22:05:44.033829 containerd[1483]: time="2025-03-20T22:05:44.033044594Z" level=info msg="CreateContainer within sandbox \"b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 20 22:05:44.046665 containerd[1483]: time="2025-03-20T22:05:44.045924025Z" level=info msg="Container 81bab693446dd845cab2a83a9e2a8780329d85fcb53fa33638b182c7af2c1f4f: CDI devices from CRI Config.CDIDevices: []"
Mar 20 22:05:44.062492 containerd[1483]: time="2025-03-20T22:05:44.062367013Z" level=info msg="CreateContainer within sandbox \"b9304da6d8bdf0063b65fc7da59668764c6783489f49f82a55b51f3dbbd3e2d6\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"81bab693446dd845cab2a83a9e2a8780329d85fcb53fa33638b182c7af2c1f4f\""
Mar 20 22:05:44.063075 containerd[1483]: time="2025-03-20T22:05:44.063018781Z" level=info msg="StartContainer for \"81bab693446dd845cab2a83a9e2a8780329d85fcb53fa33638b182c7af2c1f4f\""
Mar 20 22:05:44.064608 containerd[1483]: time="2025-03-20T22:05:44.064529895Z" level=info msg="connecting to shim 81bab693446dd845cab2a83a9e2a8780329d85fcb53fa33638b182c7af2c1f4f" address="unix:///run/containerd/s/f272f0e075121018981c7b98eaf66210eca3b3b744a107d3d0c94ab912212ece" protocol=ttrpc version=3
Mar 20 22:05:44.092799 systemd[1]: Started cri-containerd-81bab693446dd845cab2a83a9e2a8780329d85fcb53fa33638b182c7af2c1f4f.scope - libcontainer container 81bab693446dd845cab2a83a9e2a8780329d85fcb53fa33638b182c7af2c1f4f.
Mar 20 22:05:44.141891 containerd[1483]: time="2025-03-20T22:05:44.141790183Z" level=info msg="StartContainer for \"81bab693446dd845cab2a83a9e2a8780329d85fcb53fa33638b182c7af2c1f4f\" returns successfully"
Mar 20 22:05:44.551846 kubelet[2706]: I0320 22:05:44.551427 2706 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 20 22:05:44.551846 kubelet[2706]: I0320 22:05:44.551497 2706 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 20 22:05:44.746684 kubelet[2706]: I0320 22:05:44.744910 2706 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-b4r6v" podStartSLOduration=32.137911398 podStartE2EDuration="44.744879128s" podCreationTimestamp="2025-03-20 22:05:00 +0000 UTC" firstStartedPulling="2025-03-20 22:05:31.417433514 +0000 UTC m=+43.113317931" lastFinishedPulling="2025-03-20 22:05:44.024401255 +0000 UTC m=+55.720285661" observedRunningTime="2025-03-20 22:05:44.742495261 +0000 UTC m=+56.438379717" watchObservedRunningTime="2025-03-20 22:05:44.744879128 +0000 UTC m=+56.440763584"
Mar 20 22:05:50.389299 containerd[1483]: time="2025-03-20T22:05:50.389251415Z" level=info msg="TaskExit event in podsandbox handler container_id:\"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\" id:\"56972566a4412f9c82ae5d011bd99337670f3e016e1206918bced024753c0c75\" pid:4740 exited_at:{seconds:1742508350 nanos:388140015}"
Mar 20 22:05:51.789259 containerd[1483]: time="2025-03-20T22:05:51.789179243Z" level=info msg="TaskExit event in podsandbox handler container_id:\"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\" id:\"899a5c7f1787f4a6d819bdeba3b8cb20261b05e683fa56ee9a6e850e37979641\" pid:4765 exited_at:{seconds:1742508351 nanos:788530052}"
Mar 20 22:05:55.109256 kubelet[2706]: I0320 22:05:55.108725 2706 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 22:05:57.715290 containerd[1483]: time="2025-03-20T22:05:57.715239772Z" level=info msg="TaskExit event in podsandbox handler container_id:\"059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb\" id:\"e1b131ae58c7db431522c5e7c98a6eacf3af3db305bf887e24dd6bc61d007f0f\" pid:4791 exited_at:{seconds:1742508357 nanos:714514018}"
Mar 20 22:06:20.393464 containerd[1483]: time="2025-03-20T22:06:20.393393000Z" level=info msg="TaskExit event in podsandbox handler container_id:\"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\" id:\"e3d0dcbfff59d3f15c7dcea9b94631029efcd57debbe441572b97e8d0c989865\" pid:4830 exited_at:{seconds:1742508380 nanos:392673088}"
Mar 20 22:06:27.735863 containerd[1483]: time="2025-03-20T22:06:27.735765093Z" level=info msg="TaskExit event in podsandbox handler container_id:\"059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb\" id:\"c0531de4dcfad33994e0adab40c75e8d24c1e015fe79dc7f1a6d92ce31fe63bc\" pid:4855 exited_at:{seconds:1742508387 nanos:735158835}"
Mar 20 22:06:50.383968 containerd[1483]: time="2025-03-20T22:06:50.383867120Z" level=info msg="TaskExit event in podsandbox handler container_id:\"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\" id:\"12e6612fb86283687f93cce0cc3385ce4dccb6d8f81c16d8da5abca387e289dd\" pid:4889 exited_at:{seconds:1742508410 nanos:383351461}"
Mar 20 22:06:51.776220 containerd[1483]: time="2025-03-20T22:06:51.776147056Z" level=info msg="TaskExit event in podsandbox handler container_id:\"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\" id:\"0642accf0a36b2d38103d8a5dfb2525636ca188146c7344e4e9e56ad91c718ca\" pid:4912 exited_at:{seconds:1742508411 nanos:775783154}"
Mar 20 22:06:57.721870 containerd[1483]: time="2025-03-20T22:06:57.721790552Z" level=info msg="TaskExit event in podsandbox handler container_id:\"059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb\" id:\"7ac86a6e0ce14935802cb088d92436b67492c55618ca2710d5e5b3155466f211\" pid:4936 exited_at:{seconds:1742508417 nanos:721325820}"
Mar 20 22:07:11.192582 systemd[1]: Started sshd@9-172.24.4.166:22-172.24.4.1:50522.service - OpenSSH per-connection server daemon (172.24.4.1:50522).
Mar 20 22:07:12.478437 sshd[4969]: Accepted publickey for core from 172.24.4.1 port 50522 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:07:12.483112 sshd-session[4969]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:07:12.496187 systemd-logind[1458]: New session 12 of user core.
Mar 20 22:07:12.503687 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 20 22:07:13.253608 sshd[4971]: Connection closed by 172.24.4.1 port 50522
Mar 20 22:07:13.254754 sshd-session[4969]: pam_unix(sshd:session): session closed for user core
Mar 20 22:07:13.262687 systemd-logind[1458]: Session 12 logged out. Waiting for processes to exit.
Mar 20 22:07:13.264177 systemd[1]: sshd@9-172.24.4.166:22-172.24.4.1:50522.service: Deactivated successfully.
Mar 20 22:07:13.269419 systemd[1]: session-12.scope: Deactivated successfully.
Mar 20 22:07:13.272257 systemd-logind[1458]: Removed session 12.
Mar 20 22:07:18.273937 systemd[1]: Started sshd@10-172.24.4.166:22-172.24.4.1:46186.service - OpenSSH per-connection server daemon (172.24.4.1:46186).
Mar 20 22:07:19.421701 sshd[4983]: Accepted publickey for core from 172.24.4.1 port 46186 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:07:19.424367 sshd-session[4983]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:07:19.436740 systemd-logind[1458]: New session 13 of user core.
Mar 20 22:07:19.442972 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 20 22:07:20.168803 sshd[4985]: Connection closed by 172.24.4.1 port 46186
Mar 20 22:07:20.169925 sshd-session[4983]: pam_unix(sshd:session): session closed for user core
Mar 20 22:07:20.177933 systemd[1]: sshd@10-172.24.4.166:22-172.24.4.1:46186.service: Deactivated successfully.
Mar 20 22:07:20.182595 systemd[1]: session-13.scope: Deactivated successfully.
Mar 20 22:07:20.185010 systemd-logind[1458]: Session 13 logged out. Waiting for processes to exit.
Mar 20 22:07:20.187193 systemd-logind[1458]: Removed session 13.
Mar 20 22:07:20.395093 containerd[1483]: time="2025-03-20T22:07:20.394999843Z" level=info msg="TaskExit event in podsandbox handler container_id:\"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\" id:\"f0aa8315b7b747a7ad9c1d67157bdd44201312c6017d189f06e55c1fbd3e7c1c\" pid:5009 exited_at:{seconds:1742508440 nanos:393137728}"
Mar 20 22:07:25.189361 systemd[1]: Started sshd@11-172.24.4.166:22-172.24.4.1:43760.service - OpenSSH per-connection server daemon (172.24.4.1:43760).
Mar 20 22:07:26.398033 sshd[5021]: Accepted publickey for core from 172.24.4.1 port 43760 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:07:26.401159 sshd-session[5021]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:07:26.414787 systemd-logind[1458]: New session 14 of user core.
Mar 20 22:07:26.419945 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 20 22:07:27.379732 sshd[5023]: Connection closed by 172.24.4.1 port 43760
Mar 20 22:07:27.382372 sshd-session[5021]: pam_unix(sshd:session): session closed for user core
Mar 20 22:07:27.395550 systemd[1]: sshd@11-172.24.4.166:22-172.24.4.1:43760.service: Deactivated successfully.
Mar 20 22:07:27.399734 systemd[1]: session-14.scope: Deactivated successfully.
Mar 20 22:07:27.401780 systemd-logind[1458]: Session 14 logged out. Waiting for processes to exit.
Mar 20 22:07:27.406877 systemd[1]: Started sshd@12-172.24.4.166:22-172.24.4.1:43776.service - OpenSSH per-connection server daemon (172.24.4.1:43776).
Mar 20 22:07:27.410749 systemd-logind[1458]: Removed session 14.
Mar 20 22:07:27.731613 containerd[1483]: time="2025-03-20T22:07:27.731455942Z" level=info msg="TaskExit event in podsandbox handler container_id:\"059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb\" id:\"a912f12aaf32fca774378964e6e56bec6f119f687f2f1f197f7197de2c3d4b38\" pid:5050 exited_at:{seconds:1742508447 nanos:730978626}"
Mar 20 22:07:28.574284 sshd[5035]: Accepted publickey for core from 172.24.4.1 port 43776 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:07:28.580370 sshd-session[5035]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:07:28.593399 systemd-logind[1458]: New session 15 of user core.
Mar 20 22:07:28.604968 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 20 22:07:29.398106 sshd[5062]: Connection closed by 172.24.4.1 port 43776
Mar 20 22:07:29.397255 sshd-session[5035]: pam_unix(sshd:session): session closed for user core
Mar 20 22:07:29.418867 systemd[1]: sshd@12-172.24.4.166:22-172.24.4.1:43776.service: Deactivated successfully.
Mar 20 22:07:29.422573 systemd[1]: session-15.scope: Deactivated successfully.
Mar 20 22:07:29.424335 systemd-logind[1458]: Session 15 logged out. Waiting for processes to exit.
Mar 20 22:07:29.427877 systemd[1]: Started sshd@13-172.24.4.166:22-172.24.4.1:43786.service - OpenSSH per-connection server daemon (172.24.4.1:43786).
Mar 20 22:07:29.429817 systemd-logind[1458]: Removed session 15.
Mar 20 22:07:30.481671 sshd[5070]: Accepted publickey for core from 172.24.4.1 port 43786 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:07:30.484295 sshd-session[5070]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:07:30.495341 systemd-logind[1458]: New session 16 of user core.
Mar 20 22:07:30.502903 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 20 22:07:31.229784 sshd[5073]: Connection closed by 172.24.4.1 port 43786
Mar 20 22:07:31.229427 sshd-session[5070]: pam_unix(sshd:session): session closed for user core
Mar 20 22:07:31.237347 systemd-logind[1458]: Session 16 logged out. Waiting for processes to exit.
Mar 20 22:07:31.239720 systemd[1]: sshd@13-172.24.4.166:22-172.24.4.1:43786.service: Deactivated successfully.
Mar 20 22:07:31.246476 systemd[1]: session-16.scope: Deactivated successfully.
Mar 20 22:07:31.249166 systemd-logind[1458]: Removed session 16.
Mar 20 22:07:36.252927 systemd[1]: Started sshd@14-172.24.4.166:22-172.24.4.1:53608.service - OpenSSH per-connection server daemon (172.24.4.1:53608).
Mar 20 22:07:37.570233 sshd[5088]: Accepted publickey for core from 172.24.4.1 port 53608 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:07:37.573052 sshd-session[5088]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:07:37.584784 systemd-logind[1458]: New session 17 of user core.
Mar 20 22:07:37.591956 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 20 22:07:38.193414 sshd[5090]: Connection closed by 172.24.4.1 port 53608
Mar 20 22:07:38.192191 sshd-session[5088]: pam_unix(sshd:session): session closed for user core
Mar 20 22:07:38.200459 systemd[1]: sshd@14-172.24.4.166:22-172.24.4.1:53608.service: Deactivated successfully.
Mar 20 22:07:38.206603 systemd[1]: session-17.scope: Deactivated successfully.
Mar 20 22:07:38.212418 systemd-logind[1458]: Session 17 logged out. Waiting for processes to exit.
Mar 20 22:07:38.215150 systemd-logind[1458]: Removed session 17.
Mar 20 22:07:43.211554 systemd[1]: Started sshd@15-172.24.4.166:22-172.24.4.1:53622.service - OpenSSH per-connection server daemon (172.24.4.1:53622).
Mar 20 22:07:44.604249 sshd[5103]: Accepted publickey for core from 172.24.4.1 port 53622 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:07:44.608038 sshd-session[5103]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:07:44.621510 systemd-logind[1458]: New session 18 of user core.
Mar 20 22:07:44.632990 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 20 22:07:45.352482 sshd[5105]: Connection closed by 172.24.4.1 port 53622
Mar 20 22:07:45.353389 sshd-session[5103]: pam_unix(sshd:session): session closed for user core
Mar 20 22:07:45.357419 systemd-logind[1458]: Session 18 logged out. Waiting for processes to exit.
Mar 20 22:07:45.358188 systemd[1]: sshd@15-172.24.4.166:22-172.24.4.1:53622.service: Deactivated successfully.
Mar 20 22:07:45.361423 systemd[1]: session-18.scope: Deactivated successfully.
Mar 20 22:07:45.364667 systemd-logind[1458]: Removed session 18.
Mar 20 22:07:50.376126 systemd[1]: Started sshd@16-172.24.4.166:22-172.24.4.1:48190.service - OpenSSH per-connection server daemon (172.24.4.1:48190).
Mar 20 22:07:50.385787 containerd[1483]: time="2025-03-20T22:07:50.385736081Z" level=info msg="TaskExit event in podsandbox handler container_id:\"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\" id:\"dbec7495475e49e2641cef5f57deb30f69da2405592935dc12331241cd317356\" pid:5131 exited_at:{seconds:1742508470 nanos:383612709}"
Mar 20 22:07:51.500866 sshd[5138]: Accepted publickey for core from 172.24.4.1 port 48190 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:07:51.503569 sshd-session[5138]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:07:51.514764 systemd-logind[1458]: New session 19 of user core.
Mar 20 22:07:51.519926 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 20 22:07:51.787166 containerd[1483]: time="2025-03-20T22:07:51.785872098Z" level=info msg="TaskExit event in podsandbox handler container_id:\"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\" id:\"c10fbd6a4a212d98ab859051f05cfba480ed94fa5c74d1b561e45816a8bc38a4\" pid:5157 exited_at:{seconds:1742508471 nanos:783904288}"
Mar 20 22:07:52.166242 sshd[5143]: Connection closed by 172.24.4.1 port 48190
Mar 20 22:07:52.165616 sshd-session[5138]: pam_unix(sshd:session): session closed for user core
Mar 20 22:07:52.186425 systemd[1]: sshd@16-172.24.4.166:22-172.24.4.1:48190.service: Deactivated successfully.
Mar 20 22:07:52.192048 systemd[1]: session-19.scope: Deactivated successfully.
Mar 20 22:07:52.194663 systemd-logind[1458]: Session 19 logged out. Waiting for processes to exit.
Mar 20 22:07:52.200518 systemd[1]: Started sshd@17-172.24.4.166:22-172.24.4.1:48200.service - OpenSSH per-connection server daemon (172.24.4.1:48200).
Mar 20 22:07:52.204536 systemd-logind[1458]: Removed session 19.
Mar 20 22:07:53.354137 sshd[5175]: Accepted publickey for core from 172.24.4.1 port 48200 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:07:53.356809 sshd-session[5175]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:07:53.370069 systemd-logind[1458]: New session 20 of user core.
Mar 20 22:07:53.383955 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 20 22:07:54.419682 sshd[5178]: Connection closed by 172.24.4.1 port 48200
Mar 20 22:07:54.420590 sshd-session[5175]: pam_unix(sshd:session): session closed for user core
Mar 20 22:07:54.439196 systemd[1]: sshd@17-172.24.4.166:22-172.24.4.1:48200.service: Deactivated successfully.
Mar 20 22:07:54.443148 systemd[1]: session-20.scope: Deactivated successfully.
Mar 20 22:07:54.445248 systemd-logind[1458]: Session 20 logged out. Waiting for processes to exit.
Mar 20 22:07:54.452195 systemd[1]: Started sshd@18-172.24.4.166:22-172.24.4.1:47172.service - OpenSSH per-connection server daemon (172.24.4.1:47172).
Mar 20 22:07:54.454811 systemd-logind[1458]: Removed session 20.
Mar 20 22:07:55.647020 sshd[5189]: Accepted publickey for core from 172.24.4.1 port 47172 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:07:55.650341 sshd-session[5189]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:07:55.662745 systemd-logind[1458]: New session 21 of user core.
Mar 20 22:07:55.676046 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 20 22:07:57.715058 containerd[1483]: time="2025-03-20T22:07:57.715021928Z" level=info msg="TaskExit event in podsandbox handler container_id:\"059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb\" id:\"5006a389444a94f348f85d07ee1f62ffec1ab38e29e34dbc92725bcffa62e379\" pid:5214 exited_at:{seconds:1742508477 nanos:714735850}"
Mar 20 22:07:58.797457 sshd[5192]: Connection closed by 172.24.4.1 port 47172
Mar 20 22:07:58.799310 sshd-session[5189]: pam_unix(sshd:session): session closed for user core
Mar 20 22:07:58.815035 systemd[1]: sshd@18-172.24.4.166:22-172.24.4.1:47172.service: Deactivated successfully.
Mar 20 22:07:58.819393 systemd[1]: session-21.scope: Deactivated successfully.
Mar 20 22:07:58.820025 systemd[1]: session-21.scope: Consumed 926ms CPU time, 69.8M memory peak.
Mar 20 22:07:58.821853 systemd-logind[1458]: Session 21 logged out. Waiting for processes to exit.
Mar 20 22:07:58.827517 systemd[1]: Started sshd@19-172.24.4.166:22-172.24.4.1:47180.service - OpenSSH per-connection server daemon (172.24.4.1:47180).
Mar 20 22:07:58.831854 systemd-logind[1458]: Removed session 21.
Mar 20 22:08:00.211297 sshd[5232]: Accepted publickey for core from 172.24.4.1 port 47180 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:08:00.214295 sshd-session[5232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:08:00.226722 systemd-logind[1458]: New session 22 of user core.
Mar 20 22:08:00.234965 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 20 22:08:01.359824 sshd[5235]: Connection closed by 172.24.4.1 port 47180
Mar 20 22:08:01.360809 sshd-session[5232]: pam_unix(sshd:session): session closed for user core
Mar 20 22:08:01.372334 systemd[1]: sshd@19-172.24.4.166:22-172.24.4.1:47180.service: Deactivated successfully.
Mar 20 22:08:01.377103 systemd[1]: session-22.scope: Deactivated successfully.
Mar 20 22:08:01.381001 systemd-logind[1458]: Session 22 logged out. Waiting for processes to exit.
Mar 20 22:08:01.385341 systemd[1]: Started sshd@20-172.24.4.166:22-172.24.4.1:47184.service - OpenSSH per-connection server daemon (172.24.4.1:47184).
Mar 20 22:08:01.387079 systemd-logind[1458]: Removed session 22.
Mar 20 22:08:02.588882 sshd[5245]: Accepted publickey for core from 172.24.4.1 port 47184 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:08:02.591933 sshd-session[5245]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:08:02.605605 systemd-logind[1458]: New session 23 of user core.
Mar 20 22:08:02.613975 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 20 22:08:03.335779 sshd[5248]: Connection closed by 172.24.4.1 port 47184
Mar 20 22:08:03.337060 sshd-session[5245]: pam_unix(sshd:session): session closed for user core
Mar 20 22:08:03.344885 systemd[1]: sshd@20-172.24.4.166:22-172.24.4.1:47184.service: Deactivated successfully.
Mar 20 22:08:03.348941 systemd[1]: session-23.scope: Deactivated successfully.
Mar 20 22:08:03.351218 systemd-logind[1458]: Session 23 logged out. Waiting for processes to exit.
Mar 20 22:08:03.354285 systemd-logind[1458]: Removed session 23.
Mar 20 22:08:08.357841 systemd[1]: Started sshd@21-172.24.4.166:22-172.24.4.1:43086.service - OpenSSH per-connection server daemon (172.24.4.1:43086).
Mar 20 22:08:09.529175 sshd[5269]: Accepted publickey for core from 172.24.4.1 port 43086 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:08:09.532059 sshd-session[5269]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:08:09.551021 systemd-logind[1458]: New session 24 of user core.
Mar 20 22:08:09.559013 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 20 22:08:10.140485 sshd[5271]: Connection closed by 172.24.4.1 port 43086
Mar 20 22:08:10.141993 sshd-session[5269]: pam_unix(sshd:session): session closed for user core
Mar 20 22:08:10.151607 systemd[1]: sshd@21-172.24.4.166:22-172.24.4.1:43086.service: Deactivated successfully.
Mar 20 22:08:10.156473 systemd[1]: session-24.scope: Deactivated successfully.
Mar 20 22:08:10.159440 systemd-logind[1458]: Session 24 logged out. Waiting for processes to exit.
Mar 20 22:08:10.162965 systemd-logind[1458]: Removed session 24.
Mar 20 22:08:15.166546 systemd[1]: Started sshd@22-172.24.4.166:22-172.24.4.1:52712.service - OpenSSH per-connection server daemon (172.24.4.1:52712).
Mar 20 22:08:16.385685 sshd[5285]: Accepted publickey for core from 172.24.4.1 port 52712 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:08:16.388619 sshd-session[5285]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:08:16.401133 systemd-logind[1458]: New session 25 of user core.
Mar 20 22:08:16.409967 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 20 22:08:17.176741 sshd[5287]: Connection closed by 172.24.4.1 port 52712
Mar 20 22:08:17.177912 sshd-session[5285]: pam_unix(sshd:session): session closed for user core
Mar 20 22:08:17.185707 systemd[1]: sshd@22-172.24.4.166:22-172.24.4.1:52712.service: Deactivated successfully.
Mar 20 22:08:17.190589 systemd[1]: session-25.scope: Deactivated successfully.
Mar 20 22:08:17.192855 systemd-logind[1458]: Session 25 logged out. Waiting for processes to exit.
Mar 20 22:08:17.195044 systemd-logind[1458]: Removed session 25.
Mar 20 22:08:20.385082 containerd[1483]: time="2025-03-20T22:08:20.385035228Z" level=info msg="TaskExit event in podsandbox handler container_id:\"934f7cfc32821da7a8a238a078321871afa70b5dddecfc2efb769e2edbe73c99\" id:\"0b53fb7ecab6ba21cdfd21b8a0b7d4c8b16f722d49403515511a023d555817a5\" pid:5311 exited_at:{seconds:1742508500 nanos:384542902}"
Mar 20 22:08:22.198896 systemd[1]: Started sshd@23-172.24.4.166:22-172.24.4.1:52728.service - OpenSSH per-connection server daemon (172.24.4.1:52728).
Mar 20 22:08:23.404181 sshd[5322]: Accepted publickey for core from 172.24.4.1 port 52728 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:08:23.406911 sshd-session[5322]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:08:23.418362 systemd-logind[1458]: New session 26 of user core.
Mar 20 22:08:23.422918 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 20 22:08:24.172563 sshd[5324]: Connection closed by 172.24.4.1 port 52728
Mar 20 22:08:24.173894 sshd-session[5322]: pam_unix(sshd:session): session closed for user core
Mar 20 22:08:24.181523 systemd[1]: sshd@23-172.24.4.166:22-172.24.4.1:52728.service: Deactivated successfully.
Mar 20 22:08:24.185592 systemd[1]: session-26.scope: Deactivated successfully.
Mar 20 22:08:24.188179 systemd-logind[1458]: Session 26 logged out. Waiting for processes to exit.
Mar 20 22:08:24.190935 systemd-logind[1458]: Removed session 26.
Mar 20 22:08:27.725073 containerd[1483]: time="2025-03-20T22:08:27.724535668Z" level=info msg="TaskExit event in podsandbox handler container_id:\"059f8854f5fe4e9d2ec7b8ba29318bdc616f9c0bce9ec01bafaea549ebaccdbb\" id:\"73d0b0f53a60722d9456a836f50308a8abe59ffc9fdefdf971cb22a7c87fa198\" pid:5350 exited_at:{seconds:1742508507 nanos:724106121}"
Mar 20 22:08:29.195684 systemd[1]: Started sshd@24-172.24.4.166:22-172.24.4.1:46594.service - OpenSSH per-connection server daemon (172.24.4.1:46594).
Mar 20 22:08:30.604964 sshd[5363]: Accepted publickey for core from 172.24.4.1 port 46594 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:08:30.609857 sshd-session[5363]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:08:30.623177 systemd-logind[1458]: New session 27 of user core.
Mar 20 22:08:30.630937 systemd[1]: Started session-27.scope - Session 27 of User core.
Mar 20 22:08:31.524532 sshd[5365]: Connection closed by 172.24.4.1 port 46594
Mar 20 22:08:31.525935 sshd-session[5363]: pam_unix(sshd:session): session closed for user core
Mar 20 22:08:31.534935 systemd[1]: sshd@24-172.24.4.166:22-172.24.4.1:46594.service: Deactivated successfully.
Mar 20 22:08:31.541102 systemd[1]: session-27.scope: Deactivated successfully.
Mar 20 22:08:31.544350 systemd-logind[1458]: Session 27 logged out. Waiting for processes to exit.
Mar 20 22:08:31.547026 systemd-logind[1458]: Removed session 27.
Mar 20 22:08:36.546280 systemd[1]: Started sshd@25-172.24.4.166:22-172.24.4.1:46378.service - OpenSSH per-connection server daemon (172.24.4.1:46378).
Mar 20 22:08:37.713763 sshd[5393]: Accepted publickey for core from 172.24.4.1 port 46378 ssh2: RSA SHA256:luFDhjljDAkziPUt17AfotpNzGBo7u8zEz4RZJQIe48
Mar 20 22:08:37.716982 sshd-session[5393]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 22:08:37.728514 systemd-logind[1458]: New session 28 of user core.
Mar 20 22:08:37.735962 systemd[1]: Started session-28.scope - Session 28 of User core.
Mar 20 22:08:38.459585 sshd[5396]: Connection closed by 172.24.4.1 port 46378
Mar 20 22:08:38.460713 sshd-session[5393]: pam_unix(sshd:session): session closed for user core
Mar 20 22:08:38.467910 systemd[1]: sshd@25-172.24.4.166:22-172.24.4.1:46378.service: Deactivated successfully.
Mar 20 22:08:38.472431 systemd[1]: session-28.scope: Deactivated successfully.
Mar 20 22:08:38.474432 systemd-logind[1458]: Session 28 logged out. Waiting for processes to exit.
Mar 20 22:08:38.476572 systemd-logind[1458]: Removed session 28.