Jul 9 14:56:14.093372 kernel: Linux version 6.12.36-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Wed Jul 9 08:38:39 -00 2025 Jul 9 14:56:14.093402 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f85d3be94c634d7d72fbcd0e670073ce56ae2e0cc763f83b329300b7cea5203d Jul 9 14:56:14.093413 kernel: BIOS-provided physical RAM map: Jul 9 14:56:14.093424 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Jul 9 14:56:14.093433 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Jul 9 14:56:14.093441 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Jul 9 14:56:14.093451 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable Jul 9 14:56:14.093467 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved Jul 9 14:56:14.093476 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jul 9 14:56:14.093485 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Jul 9 14:56:14.093493 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable Jul 9 14:56:14.093502 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Jul 9 14:56:14.093513 kernel: NX (Execute Disable) protection: active Jul 9 14:56:14.093522 kernel: APIC: Static calls initialized Jul 9 14:56:14.093532 kernel: SMBIOS 3.0.0 present. Jul 9 14:56:14.093541 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014 Jul 9 14:56:14.093555 kernel: DMI: Memory slots populated: 1/1 Jul 9 14:56:14.093566 kernel: Hypervisor detected: KVM Jul 9 14:56:14.093575 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Jul 9 14:56:14.093584 kernel: kvm-clock: using sched offset of 5937943239 cycles Jul 9 14:56:14.093594 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Jul 9 14:56:14.093604 kernel: tsc: Detected 1996.249 MHz processor Jul 9 14:56:14.093613 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jul 9 14:56:14.093623 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jul 9 14:56:14.093632 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000 Jul 9 14:56:14.093642 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Jul 9 14:56:14.093653 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jul 9 14:56:14.093663 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000 Jul 9 14:56:14.093672 kernel: ACPI: Early table checksum verification disabled Jul 9 14:56:14.093681 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS ) Jul 9 14:56:14.093691 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jul 9 14:56:14.093700 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jul 9 14:56:14.094622 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jul 9 14:56:14.094635 kernel: ACPI: FACS 0x00000000BFFE0000 000040 Jul 9 14:56:14.094644 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jul 9 14:56:14.094659 kernel: ACPI: WAET 0x00000000BFFE1B3D 
000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jul 9 14:56:14.094668 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc] Jul 9 14:56:14.094677 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48] Jul 9 14:56:14.094687 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f] Jul 9 14:56:14.094696 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c] Jul 9 14:56:14.096157 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64] Jul 9 14:56:14.096175 kernel: No NUMA configuration found Jul 9 14:56:14.096185 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff] Jul 9 14:56:14.096194 kernel: NODE_DATA(0) allocated [mem 0x13fff5dc0-0x13fffcfff] Jul 9 14:56:14.096204 kernel: Zone ranges: Jul 9 14:56:14.096214 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jul 9 14:56:14.096224 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Jul 9 14:56:14.096233 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff] Jul 9 14:56:14.096243 kernel: Device empty Jul 9 14:56:14.096254 kernel: Movable zone start for each node Jul 9 14:56:14.096349 kernel: Early memory node ranges Jul 9 14:56:14.096360 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Jul 9 14:56:14.096370 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff] Jul 9 14:56:14.096379 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff] Jul 9 14:56:14.096389 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff] Jul 9 14:56:14.096399 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jul 9 14:56:14.096409 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Jul 9 14:56:14.096419 kernel: On node 0, zone Normal: 35 pages in unavailable ranges Jul 9 14:56:14.096432 kernel: ACPI: PM-Timer IO Port: 0x608 Jul 9 14:56:14.096442 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Jul 9 14:56:14.096451 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jul 9 14:56:14.096469 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Jul 9 14:56:14.096479 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Jul 9 14:56:14.096492 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jul 9 14:56:14.096502 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Jul 9 14:56:14.096512 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Jul 9 14:56:14.096527 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jul 9 14:56:14.096539 kernel: CPU topo: Max. logical packages: 2 Jul 9 14:56:14.096549 kernel: CPU topo: Max. logical dies: 2 Jul 9 14:56:14.096558 kernel: CPU topo: Max. dies per package: 1 Jul 9 14:56:14.096568 kernel: CPU topo: Max. threads per core: 1 Jul 9 14:56:14.096577 kernel: CPU topo: Num. cores per package: 1 Jul 9 14:56:14.096587 kernel: CPU topo: Num. 
threads per package: 1 Jul 9 14:56:14.096597 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jul 9 14:56:14.096607 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Jul 9 14:56:14.096616 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices Jul 9 14:56:14.096626 kernel: Booting paravirtualized kernel on KVM Jul 9 14:56:14.096638 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jul 9 14:56:14.096648 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jul 9 14:56:14.096658 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jul 9 14:56:14.096668 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jul 9 14:56:14.096677 kernel: pcpu-alloc: [0] 0 1 Jul 9 14:56:14.096686 kernel: kvm-guest: PV spinlocks disabled, no host support Jul 9 14:56:14.096698 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f85d3be94c634d7d72fbcd0e670073ce56ae2e0cc763f83b329300b7cea5203d Jul 9 14:56:14.096782 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Jul 9 14:56:14.096798 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jul 9 14:56:14.096808 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jul 9 14:56:14.096818 kernel: Fallback order for Node 0: 0 Jul 9 14:56:14.096828 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1048443 Jul 9 14:56:14.096837 kernel: Policy zone: Normal Jul 9 14:56:14.096847 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jul 9 14:56:14.096857 kernel: software IO TLB: area num 2. Jul 9 14:56:14.096866 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jul 9 14:56:14.096876 kernel: ftrace: allocating 40097 entries in 157 pages Jul 9 14:56:14.096887 kernel: ftrace: allocated 157 pages with 5 groups Jul 9 14:56:14.096897 kernel: Dynamic Preempt: voluntary Jul 9 14:56:14.096906 kernel: rcu: Preemptible hierarchical RCU implementation. Jul 9 14:56:14.096917 kernel: rcu: RCU event tracing is enabled. Jul 9 14:56:14.096926 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jul 9 14:56:14.096936 kernel: Trampoline variant of Tasks RCU enabled. Jul 9 14:56:14.096946 kernel: Rude variant of Tasks RCU enabled. Jul 9 14:56:14.096955 kernel: Tracing variant of Tasks RCU enabled. Jul 9 14:56:14.096965 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jul 9 14:56:14.096977 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jul 9 14:56:14.096987 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jul 9 14:56:14.096997 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jul 9 14:56:14.097006 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jul 9 14:56:14.097016 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Jul 9 14:56:14.097026 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. 
Jul 9 14:56:14.097041 kernel: Console: colour VGA+ 80x25 Jul 9 14:56:14.097051 kernel: printk: legacy console [tty0] enabled Jul 9 14:56:14.097060 kernel: printk: legacy console [ttyS0] enabled Jul 9 14:56:14.097072 kernel: ACPI: Core revision 20240827 Jul 9 14:56:14.097081 kernel: APIC: Switch to symmetric I/O mode setup Jul 9 14:56:14.097091 kernel: x2apic enabled Jul 9 14:56:14.097100 kernel: APIC: Switched APIC routing to: physical x2apic Jul 9 14:56:14.097110 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jul 9 14:56:14.097120 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized Jul 9 14:56:14.097136 kernel: Calibrating delay loop (skipped) preset value.. 3992.49 BogoMIPS (lpj=1996249) Jul 9 14:56:14.097148 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Jul 9 14:56:14.097158 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Jul 9 14:56:14.097168 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jul 9 14:56:14.097178 kernel: Spectre V2 : Mitigation: Retpolines Jul 9 14:56:14.097194 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Jul 9 14:56:14.097207 kernel: Speculative Store Bypass: Vulnerable Jul 9 14:56:14.097217 kernel: x86/fpu: x87 FPU will use FXSAVE Jul 9 14:56:14.097227 kernel: Freeing SMP alternatives memory: 32K Jul 9 14:56:14.097237 kernel: pid_max: default: 32768 minimum: 301 Jul 9 14:56:14.097249 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jul 9 14:56:14.097259 kernel: landlock: Up and running. Jul 9 14:56:14.097269 kernel: SELinux: Initializing. Jul 9 14:56:14.097280 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jul 9 14:56:14.097290 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jul 9 14:56:14.097301 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3) Jul 9 14:56:14.097311 kernel: Performance Events: AMD PMU driver. Jul 9 14:56:14.097321 kernel: ... version: 0 Jul 9 14:56:14.097331 kernel: ... bit width: 48 Jul 9 14:56:14.097343 kernel: ... generic registers: 4 Jul 9 14:56:14.097353 kernel: ... value mask: 0000ffffffffffff Jul 9 14:56:14.097363 kernel: ... max period: 00007fffffffffff Jul 9 14:56:14.097373 kernel: ... fixed-purpose events: 0 Jul 9 14:56:14.097383 kernel: ... event mask: 000000000000000f Jul 9 14:56:14.097393 kernel: signal: max sigframe size: 1440 Jul 9 14:56:14.097403 kernel: rcu: Hierarchical SRCU implementation. Jul 9 14:56:14.097413 kernel: rcu: Max phase no-delay instances is 400. Jul 9 14:56:14.097423 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jul 9 14:56:14.097433 kernel: smp: Bringing up secondary CPUs ... Jul 9 14:56:14.097445 kernel: smpboot: x86: Booting SMP configuration: Jul 9 14:56:14.097456 kernel: .... 
node #0, CPUs: #1 Jul 9 14:56:14.097466 kernel: smp: Brought up 1 node, 2 CPUs Jul 9 14:56:14.097476 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS) Jul 9 14:56:14.097486 kernel: Memory: 3961276K/4193772K available (14336K kernel code, 2430K rwdata, 9956K rodata, 54568K init, 2400K bss, 227296K reserved, 0K cma-reserved) Jul 9 14:56:14.097496 kernel: devtmpfs: initialized Jul 9 14:56:14.097506 kernel: x86/mm: Memory block size: 128MB Jul 9 14:56:14.097517 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jul 9 14:56:14.097527 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jul 9 14:56:14.097544 kernel: pinctrl core: initialized pinctrl subsystem Jul 9 14:56:14.097554 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jul 9 14:56:14.097564 kernel: audit: initializing netlink subsys (disabled) Jul 9 14:56:14.097574 kernel: audit: type=2000 audit(1752072969.547:1): state=initialized audit_enabled=0 res=1 Jul 9 14:56:14.097584 kernel: thermal_sys: Registered thermal governor 'step_wise' Jul 9 14:56:14.097594 kernel: thermal_sys: Registered thermal governor 'user_space' Jul 9 14:56:14.097604 kernel: cpuidle: using governor menu Jul 9 14:56:14.097614 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jul 9 14:56:14.097625 kernel: dca service started, version 1.12.1 Jul 9 14:56:14.097638 kernel: PCI: Using configuration type 1 for base access Jul 9 14:56:14.097648 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jul 9 14:56:14.097659 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jul 9 14:56:14.097669 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jul 9 14:56:14.097679 kernel: ACPI: Added _OSI(Module Device) Jul 9 14:56:14.097689 kernel: ACPI: Added _OSI(Processor Device) Jul 9 14:56:14.097700 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jul 9 14:56:14.098929 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jul 9 14:56:14.098942 kernel: ACPI: Interpreter enabled Jul 9 14:56:14.098957 kernel: ACPI: PM: (supports S0 S3 S5) Jul 9 14:56:14.098966 kernel: ACPI: Using IOAPIC for interrupt routing Jul 9 14:56:14.098976 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jul 9 14:56:14.098985 kernel: PCI: Using E820 reservations for host bridge windows Jul 9 14:56:14.098995 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F Jul 9 14:56:14.099004 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jul 9 14:56:14.099202 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3] Jul 9 14:56:14.099312 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI] Jul 9 14:56:14.099421 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge Jul 9 14:56:14.099437 kernel: acpiphp: Slot [3] registered Jul 9 14:56:14.099447 kernel: acpiphp: Slot [4] registered Jul 9 14:56:14.099457 kernel: acpiphp: Slot [5] registered Jul 9 14:56:14.099476 kernel: acpiphp: Slot [6] registered Jul 9 14:56:14.099487 kernel: acpiphp: Slot [7] registered Jul 9 14:56:14.099497 kernel: acpiphp: Slot [8] registered Jul 9 14:56:14.099507 kernel: acpiphp: Slot [9] registered Jul 9 14:56:14.099522 kernel: acpiphp: Slot [10] registered Jul 9 14:56:14.099532 kernel: acpiphp: Slot [11] registered Jul 9 14:56:14.099542 kernel: acpiphp: Slot 
[12] registered Jul 9 14:56:14.099552 kernel: acpiphp: Slot [13] registered Jul 9 14:56:14.099562 kernel: acpiphp: Slot [14] registered Jul 9 14:56:14.099572 kernel: acpiphp: Slot [15] registered Jul 9 14:56:14.099582 kernel: acpiphp: Slot [16] registered Jul 9 14:56:14.099592 kernel: acpiphp: Slot [17] registered Jul 9 14:56:14.099602 kernel: acpiphp: Slot [18] registered Jul 9 14:56:14.099614 kernel: acpiphp: Slot [19] registered Jul 9 14:56:14.099624 kernel: acpiphp: Slot [20] registered Jul 9 14:56:14.099634 kernel: acpiphp: Slot [21] registered Jul 9 14:56:14.099648 kernel: acpiphp: Slot [22] registered Jul 9 14:56:14.099659 kernel: acpiphp: Slot [23] registered Jul 9 14:56:14.099669 kernel: acpiphp: Slot [24] registered Jul 9 14:56:14.099679 kernel: acpiphp: Slot [25] registered Jul 9 14:56:14.099689 kernel: acpiphp: Slot [26] registered Jul 9 14:56:14.099699 kernel: acpiphp: Slot [27] registered Jul 9 14:56:14.099751 kernel: acpiphp: Slot [28] registered Jul 9 14:56:14.099766 kernel: acpiphp: Slot [29] registered Jul 9 14:56:14.099776 kernel: acpiphp: Slot [30] registered Jul 9 14:56:14.099786 kernel: acpiphp: Slot [31] registered Jul 9 14:56:14.099796 kernel: PCI host bridge to bus 0000:00 Jul 9 14:56:14.099930 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jul 9 14:56:14.100029 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Jul 9 14:56:14.100121 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jul 9 14:56:14.100826 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jul 9 14:56:14.100927 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window] Jul 9 14:56:14.101019 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jul 9 14:56:14.101159 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint Jul 9 14:56:14.101314 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint Jul 9 14:56:14.101441 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint Jul 9 14:56:14.101547 kernel: pci 0000:00:01.1: BAR 4 [io 0xc120-0xc12f] Jul 9 14:56:14.101656 kernel: pci 0000:00:01.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk Jul 9 14:56:14.104725 kernel: pci 0000:00:01.1: BAR 1 [io 0x03f6]: legacy IDE quirk Jul 9 14:56:14.104842 kernel: pci 0000:00:01.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk Jul 9 14:56:14.104944 kernel: pci 0000:00:01.1: BAR 3 [io 0x0376]: legacy IDE quirk Jul 9 14:56:14.105076 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint Jul 9 14:56:14.105182 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI Jul 9 14:56:14.105295 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB Jul 9 14:56:14.105426 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Jul 9 14:56:14.105533 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref] Jul 9 14:56:14.105637 kernel: pci 0000:00:02.0: BAR 2 [mem 0xc000000000-0xc000003fff 64bit pref] Jul 9 14:56:14.105769 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff] Jul 9 14:56:14.105876 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref] Jul 9 14:56:14.105981 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jul 9 14:56:14.106123 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint Jul 9 14:56:14.106222 kernel: pci 
0000:00:03.0: BAR 0 [io 0xc080-0xc0bf] Jul 9 14:56:14.106318 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff] Jul 9 14:56:14.106414 kernel: pci 0000:00:03.0: BAR 4 [mem 0xc000004000-0xc000007fff 64bit pref] Jul 9 14:56:14.106509 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref] Jul 9 14:56:14.106650 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint Jul 9 14:56:14.106779 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc07f] Jul 9 14:56:14.106885 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff] Jul 9 14:56:14.106981 kernel: pci 0000:00:04.0: BAR 4 [mem 0xc000008000-0xc00000bfff 64bit pref] Jul 9 14:56:14.107099 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint Jul 9 14:56:14.107213 kernel: pci 0000:00:05.0: BAR 0 [io 0xc0c0-0xc0ff] Jul 9 14:56:14.107311 kernel: pci 0000:00:05.0: BAR 4 [mem 0xc00000c000-0xc00000ffff 64bit pref] Jul 9 14:56:14.107429 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Jul 9 14:56:14.107535 kernel: pci 0000:00:06.0: BAR 0 [io 0xc100-0xc11f] Jul 9 14:56:14.107637 kernel: pci 0000:00:06.0: BAR 1 [mem 0xfeb93000-0xfeb93fff] Jul 9 14:56:14.107780 kernel: pci 0000:00:06.0: BAR 4 [mem 0xc000010000-0xc000013fff 64bit pref] Jul 9 14:56:14.107797 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jul 9 14:56:14.107807 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jul 9 14:56:14.107816 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jul 9 14:56:14.107826 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jul 9 14:56:14.107835 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9 Jul 9 14:56:14.107845 kernel: iommu: Default domain type: Translated Jul 9 14:56:14.107859 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jul 9 14:56:14.107869 kernel: PCI: Using ACPI for IRQ routing Jul 9 14:56:14.107878 kernel: PCI: pci_cache_line_size set to 64 bytes Jul 9 14:56:14.107888 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Jul 9 14:56:14.107898 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff] Jul 9 14:56:14.107998 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device Jul 9 14:56:14.108094 kernel: pci 0000:00:02.0: vgaarb: bridge control possible Jul 9 14:56:14.108191 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jul 9 14:56:14.108205 kernel: vgaarb: loaded Jul 9 14:56:14.108218 kernel: clocksource: Switched to clocksource kvm-clock Jul 9 14:56:14.108228 kernel: VFS: Disk quotas dquot_6.6.0 Jul 9 14:56:14.108237 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jul 9 14:56:14.108247 kernel: pnp: PnP ACPI init Jul 9 14:56:14.108381 kernel: pnp 00:03: [dma 2] Jul 9 14:56:14.108398 kernel: pnp: PnP ACPI: found 5 devices Jul 9 14:56:14.108408 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jul 9 14:56:14.108418 kernel: NET: Registered PF_INET protocol family Jul 9 14:56:14.108431 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jul 9 14:56:14.108441 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jul 9 14:56:14.108450 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jul 9 14:56:14.108460 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jul 9 14:56:14.108469 kernel: TCP bind hash table entries: 32768 (order: 
8, 1048576 bytes, linear) Jul 9 14:56:14.108479 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jul 9 14:56:14.108488 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 9 14:56:14.108498 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 9 14:56:14.108508 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jul 9 14:56:14.108519 kernel: NET: Registered PF_XDP protocol family Jul 9 14:56:14.108606 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jul 9 14:56:14.108690 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jul 9 14:56:14.108796 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jul 9 14:56:14.108894 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window] Jul 9 14:56:14.108983 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window] Jul 9 14:56:14.109080 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release Jul 9 14:56:14.109178 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jul 9 14:56:14.109197 kernel: PCI: CLS 0 bytes, default 64 Jul 9 14:56:14.109207 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Jul 9 14:56:14.109217 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB) Jul 9 14:56:14.109226 kernel: Initialise system trusted keyrings Jul 9 14:56:14.109236 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jul 9 14:56:14.109245 kernel: Key type asymmetric registered Jul 9 14:56:14.109255 kernel: Asymmetric key parser 'x509' registered Jul 9 14:56:14.109264 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jul 9 14:56:14.109274 kernel: io scheduler mq-deadline registered Jul 9 14:56:14.109285 kernel: io scheduler kyber registered Jul 9 14:56:14.109294 kernel: io scheduler bfq registered Jul 9 14:56:14.109304 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jul 9 14:56:14.109314 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10 Jul 9 14:56:14.109324 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11 Jul 9 14:56:14.109334 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 Jul 9 14:56:14.109343 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10 Jul 9 14:56:14.109353 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 9 14:56:14.109362 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jul 9 14:56:14.109374 kernel: random: crng init done Jul 9 14:56:14.109383 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jul 9 14:56:14.109392 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jul 9 14:56:14.109402 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jul 9 14:56:14.109514 kernel: rtc_cmos 00:04: RTC can wake from S4 Jul 9 14:56:14.109531 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jul 9 14:56:14.109616 kernel: rtc_cmos 00:04: registered as rtc0 Jul 9 14:56:14.109703 kernel: rtc_cmos 00:04: setting system clock to 2025-07-09T14:56:13 UTC (1752072973) Jul 9 14:56:14.110395 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram Jul 9 14:56:14.110410 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Jul 9 14:56:14.110420 kernel: NET: Registered PF_INET6 protocol family Jul 9 14:56:14.110430 kernel: Segment Routing with IPv6 Jul 9 14:56:14.110439 kernel: In-situ OAM (IOAM) with IPv6 Jul 9 14:56:14.110448 kernel: NET: Registered PF_PACKET protocol family Jul 9 14:56:14.110458 kernel: Key type 
dns_resolver registered Jul 9 14:56:14.110467 kernel: IPI shorthand broadcast: enabled Jul 9 14:56:14.110477 kernel: sched_clock: Marking stable (4635007684, 188355369)->(4896368288, -73005235) Jul 9 14:56:14.110490 kernel: registered taskstats version 1 Jul 9 14:56:14.110499 kernel: Loading compiled-in X.509 certificates Jul 9 14:56:14.110509 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.36-flatcar: 8ba3d283fde4a005aa35ab9394afe8122b8a3878' Jul 9 14:56:14.110518 kernel: Demotion targets for Node 0: null Jul 9 14:56:14.110528 kernel: Key type .fscrypt registered Jul 9 14:56:14.110537 kernel: Key type fscrypt-provisioning registered Jul 9 14:56:14.110547 kernel: ima: No TPM chip found, activating TPM-bypass! Jul 9 14:56:14.110556 kernel: ima: Allocated hash algorithm: sha1 Jul 9 14:56:14.110568 kernel: ima: No architecture policies found Jul 9 14:56:14.110577 kernel: clk: Disabling unused clocks Jul 9 14:56:14.110587 kernel: Warning: unable to open an initial console. Jul 9 14:56:14.110596 kernel: Freeing unused kernel image (initmem) memory: 54568K Jul 9 14:56:14.110606 kernel: Write protecting the kernel read-only data: 24576k Jul 9 14:56:14.110615 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K Jul 9 14:56:14.110625 kernel: Run /init as init process Jul 9 14:56:14.110634 kernel: with arguments: Jul 9 14:56:14.110643 kernel: /init Jul 9 14:56:14.110654 kernel: with environment: Jul 9 14:56:14.110664 kernel: HOME=/ Jul 9 14:56:14.110673 kernel: TERM=linux Jul 9 14:56:14.110682 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 9 14:56:14.110699 systemd[1]: Successfully made /usr/ read-only. Jul 9 14:56:14.110738 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 9 14:56:14.110749 systemd[1]: Detected virtualization kvm. Jul 9 14:56:14.110763 systemd[1]: Detected architecture x86-64. Jul 9 14:56:14.110784 systemd[1]: Running in initrd. Jul 9 14:56:14.110795 systemd[1]: No hostname configured, using default hostname. Jul 9 14:56:14.110806 systemd[1]: Hostname set to . Jul 9 14:56:14.110817 systemd[1]: Initializing machine ID from VM UUID. Jul 9 14:56:14.110827 systemd[1]: Queued start job for default target initrd.target. Jul 9 14:56:14.110837 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 9 14:56:14.110850 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 9 14:56:14.110861 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jul 9 14:56:14.110872 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 9 14:56:14.110882 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jul 9 14:56:14.110894 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jul 9 14:56:14.110906 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jul 9 14:56:14.110918 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... 
Jul 9 14:56:14.110929 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 9 14:56:14.110939 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 9 14:56:14.110950 systemd[1]: Reached target paths.target - Path Units. Jul 9 14:56:14.110960 systemd[1]: Reached target slices.target - Slice Units. Jul 9 14:56:14.110970 systemd[1]: Reached target swap.target - Swaps. Jul 9 14:56:14.110980 systemd[1]: Reached target timers.target - Timer Units. Jul 9 14:56:14.110991 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jul 9 14:56:14.111001 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 9 14:56:14.111014 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jul 9 14:56:14.111024 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jul 9 14:56:14.111034 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 9 14:56:14.111045 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 9 14:56:14.111055 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 9 14:56:14.111065 systemd[1]: Reached target sockets.target - Socket Units. Jul 9 14:56:14.111076 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jul 9 14:56:14.111086 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 9 14:56:14.111098 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jul 9 14:56:14.111109 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jul 9 14:56:14.111120 systemd[1]: Starting systemd-fsck-usr.service... Jul 9 14:56:14.111132 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 9 14:56:14.111143 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 9 14:56:14.111155 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 9 14:56:14.111175 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jul 9 14:56:14.111211 systemd-journald[214]: Collecting audit messages is disabled. Jul 9 14:56:14.111241 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 9 14:56:14.111252 systemd[1]: Finished systemd-fsck-usr.service. Jul 9 14:56:14.111263 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 9 14:56:14.111274 systemd-journald[214]: Journal started Jul 9 14:56:14.111304 systemd-journald[214]: Runtime Journal (/run/log/journal/6a84316b74fa44c4b8ddca01e384150a) is 8M, max 78.5M, 70.5M free. Jul 9 14:56:14.069746 systemd-modules-load[216]: Inserted module 'overlay' Jul 9 14:56:14.113924 systemd[1]: Started systemd-journald.service - Journal Service. Jul 9 14:56:14.119843 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 9 14:56:14.124596 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jul 9 14:56:14.129846 systemd-modules-load[216]: Inserted module 'br_netfilter' Jul 9 14:56:14.173488 kernel: Bridge firewalling registered Jul 9 14:56:14.136315 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Jul 9 14:56:14.146405 systemd-tmpfiles[228]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jul 9 14:56:14.175491 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 9 14:56:14.176259 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 9 14:56:14.178645 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 9 14:56:14.182153 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jul 9 14:56:14.197371 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 9 14:56:14.198506 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 9 14:56:14.212910 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 9 14:56:14.217982 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 9 14:56:14.222183 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 9 14:56:14.233855 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 9 14:56:14.237901 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jul 9 14:56:14.262565 dracut-cmdline[255]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=f85d3be94c634d7d72fbcd0e670073ce56ae2e0cc763f83b329300b7cea5203d Jul 9 14:56:14.269018 systemd-resolved[251]: Positive Trust Anchors: Jul 9 14:56:14.269033 systemd-resolved[251]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 9 14:56:14.269071 systemd-resolved[251]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 9 14:56:14.272647 systemd-resolved[251]: Defaulting to hostname 'linux'. Jul 9 14:56:14.274118 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 9 14:56:14.275337 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 9 14:56:14.371750 kernel: SCSI subsystem initialized Jul 9 14:56:14.386771 kernel: Loading iSCSI transport class v2.0-870. Jul 9 14:56:14.401742 kernel: iscsi: registered transport (tcp) Jul 9 14:56:14.427247 kernel: iscsi: registered transport (qla4xxx) Jul 9 14:56:14.427339 kernel: QLogic iSCSI HBA Driver Jul 9 14:56:14.455613 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 9 14:56:14.472291 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 9 14:56:14.477081 systemd[1]: Reached target network-pre.target - Preparation for Network. 
Jul 9 14:56:14.552743 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jul 9 14:56:14.555867 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jul 9 14:56:14.634808 kernel: raid6: sse2x4 gen() 12186 MB/s Jul 9 14:56:14.653781 kernel: raid6: sse2x2 gen() 11973 MB/s Jul 9 14:56:14.672188 kernel: raid6: sse2x1 gen() 7766 MB/s Jul 9 14:56:14.672254 kernel: raid6: using algorithm sse2x4 gen() 12186 MB/s Jul 9 14:56:14.691214 kernel: raid6: .... xor() 5627 MB/s, rmw enabled Jul 9 14:56:14.691335 kernel: raid6: using ssse3x2 recovery algorithm Jul 9 14:56:14.713804 kernel: xor: measuring software checksum speed Jul 9 14:56:14.715784 kernel: prefetch64-sse : 3536 MB/sec Jul 9 14:56:14.718762 kernel: generic_sse : 1848 MB/sec Jul 9 14:56:14.718822 kernel: xor: using function: prefetch64-sse (3536 MB/sec) Jul 9 14:56:14.952795 kernel: Btrfs loaded, zoned=no, fsverity=no Jul 9 14:56:14.961137 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jul 9 14:56:14.964032 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 9 14:56:15.017209 systemd-udevd[463]: Using default interface naming scheme 'v255'. Jul 9 14:56:15.028394 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 9 14:56:15.037453 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jul 9 14:56:15.071666 dracut-pre-trigger[471]: rd.md=0: removing MD RAID activation Jul 9 14:56:15.116344 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jul 9 14:56:15.121936 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 9 14:56:15.239404 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 9 14:56:15.249421 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jul 9 14:56:15.335744 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues Jul 9 14:56:15.370188 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Jul 9 14:56:15.370679 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 9 14:56:15.382931 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB) Jul 9 14:56:15.370850 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 9 14:56:15.384884 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jul 9 14:56:15.397651 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jul 9 14:56:15.397781 kernel: GPT:17805311 != 20971519 Jul 9 14:56:15.397826 kernel: GPT:Alternate GPT header not at the end of the disk. Jul 9 14:56:15.397839 kernel: GPT:17805311 != 20971519 Jul 9 14:56:15.397863 kernel: GPT: Use GNU Parted to correct GPT errors. Jul 9 14:56:15.397878 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jul 9 14:56:15.388247 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 9 14:56:15.389472 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jul 9 14:56:15.401743 kernel: libata version 3.00 loaded. 
Jul 9 14:56:15.401772 kernel: ata_piix 0000:00:01.1: version 2.13 Jul 9 14:56:15.403747 kernel: scsi host0: ata_piix Jul 9 14:56:15.409128 kernel: scsi host1: ata_piix Jul 9 14:56:15.409773 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14 lpm-pol 0 Jul 9 14:56:15.409789 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15 lpm-pol 0 Jul 9 14:56:15.655230 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 9 14:56:15.666829 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jul 9 14:56:15.678376 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jul 9 14:56:15.689636 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jul 9 14:56:15.700318 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jul 9 14:56:15.708994 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jul 9 14:56:15.709614 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Jul 9 14:56:15.712086 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jul 9 14:56:15.715211 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 9 14:56:15.717514 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 9 14:56:15.720736 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jul 9 14:56:15.725877 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jul 9 14:56:15.747879 disk-uuid[564]: Primary Header is updated. Jul 9 14:56:15.747879 disk-uuid[564]: Secondary Entries is updated. Jul 9 14:56:15.747879 disk-uuid[564]: Secondary Header is updated. Jul 9 14:56:15.763814 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jul 9 14:56:15.783398 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jul 9 14:56:16.791781 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jul 9 14:56:16.791898 disk-uuid[565]: The operation has completed successfully. Jul 9 14:56:16.892505 systemd[1]: disk-uuid.service: Deactivated successfully. Jul 9 14:56:16.892750 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jul 9 14:56:16.938732 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jul 9 14:56:16.959819 sh[583]: Success Jul 9 14:56:16.995536 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jul 9 14:56:16.995590 kernel: device-mapper: uevent: version 1.0.3 Jul 9 14:56:16.997933 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jul 9 14:56:17.032776 kernel: device-mapper: verity: sha256 using shash "sha256-ssse3" Jul 9 14:56:17.177647 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jul 9 14:56:17.184424 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jul 9 14:56:17.201322 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Jul 9 14:56:17.232769 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Jul 9 14:56:17.240815 kernel: BTRFS: device fsid 082bcfbc-2c86-46fe-87f4-85dea5450235 devid 1 transid 36 /dev/mapper/usr (253:0) scanned by mount (595) Jul 9 14:56:17.244145 kernel: BTRFS info (device dm-0): first mount of filesystem 082bcfbc-2c86-46fe-87f4-85dea5450235 Jul 9 14:56:17.244259 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jul 9 14:56:17.245794 kernel: BTRFS info (device dm-0): using free-space-tree Jul 9 14:56:17.262105 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jul 9 14:56:17.264191 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jul 9 14:56:17.266340 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jul 9 14:56:17.268246 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jul 9 14:56:17.271081 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jul 9 14:56:17.309774 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (626) Jul 9 14:56:17.314198 kernel: BTRFS info (device vda6): first mount of filesystem 87056a6c-ee99-487a-9330-f1335025b841 Jul 9 14:56:17.314266 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jul 9 14:56:17.315780 kernel: BTRFS info (device vda6): using free-space-tree Jul 9 14:56:17.328962 kernel: BTRFS info (device vda6): last unmount of filesystem 87056a6c-ee99-487a-9330-f1335025b841 Jul 9 14:56:17.330325 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jul 9 14:56:17.335995 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jul 9 14:56:17.399326 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 9 14:56:17.402855 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 9 14:56:17.455129 systemd-networkd[765]: lo: Link UP Jul 9 14:56:17.455135 systemd-networkd[765]: lo: Gained carrier Jul 9 14:56:17.456572 systemd-networkd[765]: Enumeration completed Jul 9 14:56:17.457313 systemd-networkd[765]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 9 14:56:17.457318 systemd-networkd[765]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 9 14:56:17.530243 systemd-networkd[765]: eth0: Link UP Jul 9 14:56:17.530248 systemd-networkd[765]: eth0: Gained carrier Jul 9 14:56:17.530291 systemd-networkd[765]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 9 14:56:17.533782 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 9 14:56:17.536989 systemd[1]: Reached target network.target - Network. Jul 9 14:56:17.546739 systemd-networkd[765]: eth0: DHCPv4 address 172.24.4.222/24, gateway 172.24.4.1 acquired from 172.24.4.1 Jul 9 14:56:17.771164 ignition[677]: Ignition 2.21.0 Jul 9 14:56:17.771183 ignition[677]: Stage: fetch-offline Jul 9 14:56:17.773771 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Jul 9 14:56:17.771243 ignition[677]: no configs at "/usr/lib/ignition/base.d" Jul 9 14:56:17.775616 systemd-resolved[251]: Detected conflict on linux IN A 172.24.4.222 Jul 9 14:56:17.771255 ignition[677]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jul 9 14:56:17.775627 systemd-resolved[251]: Hostname conflict, changing published hostname from 'linux' to 'linux3'. Jul 9 14:56:17.771373 ignition[677]: parsed url from cmdline: "" Jul 9 14:56:17.775832 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jul 9 14:56:17.771377 ignition[677]: no config URL provided Jul 9 14:56:17.771383 ignition[677]: reading system config file "/usr/lib/ignition/user.ign" Jul 9 14:56:17.771392 ignition[677]: no config at "/usr/lib/ignition/user.ign" Jul 9 14:56:17.771397 ignition[677]: failed to fetch config: resource requires networking Jul 9 14:56:17.771632 ignition[677]: Ignition finished successfully Jul 9 14:56:17.799600 ignition[776]: Ignition 2.21.0 Jul 9 14:56:17.799615 ignition[776]: Stage: fetch Jul 9 14:56:17.799857 ignition[776]: no configs at "/usr/lib/ignition/base.d" Jul 9 14:56:17.799870 ignition[776]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jul 9 14:56:17.800027 ignition[776]: parsed url from cmdline: "" Jul 9 14:56:17.800032 ignition[776]: no config URL provided Jul 9 14:56:17.800037 ignition[776]: reading system config file "/usr/lib/ignition/user.ign" Jul 9 14:56:17.800047 ignition[776]: no config at "/usr/lib/ignition/user.ign" Jul 9 14:56:17.800278 ignition[776]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jul 9 14:56:17.801279 ignition[776]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jul 9 14:56:17.801400 ignition[776]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jul 9 14:56:18.118201 ignition[776]: GET result: OK Jul 9 14:56:18.118376 ignition[776]: parsing config with SHA512: 415131c601a956e085e91f2a5bdb66d90ed76fe2b69e35758e1b38494f29d38ca6591c7cd08d6d425f453ff5bdf8b07df7b21c56b9ed165ab6ac4afde97d5744 Jul 9 14:56:18.129283 unknown[776]: fetched base config from "system" Jul 9 14:56:18.129293 unknown[776]: fetched base config from "system" Jul 9 14:56:18.130006 ignition[776]: fetch: fetch complete Jul 9 14:56:18.129300 unknown[776]: fetched user config from "openstack" Jul 9 14:56:18.130012 ignition[776]: fetch: fetch passed Jul 9 14:56:18.130073 ignition[776]: Ignition finished successfully Jul 9 14:56:18.134389 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jul 9 14:56:18.138322 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jul 9 14:56:18.231637 ignition[782]: Ignition 2.21.0 Jul 9 14:56:18.231653 ignition[782]: Stage: kargs Jul 9 14:56:18.231816 ignition[782]: no configs at "/usr/lib/ignition/base.d" Jul 9 14:56:18.231827 ignition[782]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jul 9 14:56:18.236212 ignition[782]: kargs: kargs passed Jul 9 14:56:18.236263 ignition[782]: Ignition finished successfully Jul 9 14:56:18.240017 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jul 9 14:56:18.243281 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jul 9 14:56:18.394212 ignition[789]: Ignition 2.21.0 Jul 9 14:56:18.394801 ignition[789]: Stage: disks Jul 9 14:56:18.400703 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
Jul 9 14:56:18.396254 ignition[789]: no configs at "/usr/lib/ignition/base.d" Jul 9 14:56:18.396271 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jul 9 14:56:18.403092 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jul 9 14:56:18.398325 ignition[789]: disks: disks passed Jul 9 14:56:18.405038 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jul 9 14:56:18.398403 ignition[789]: Ignition finished successfully Jul 9 14:56:18.407206 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 9 14:56:18.409351 systemd[1]: Reached target sysinit.target - System Initialization. Jul 9 14:56:18.410890 systemd[1]: Reached target basic.target - Basic System. Jul 9 14:56:18.415870 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jul 9 14:56:18.465350 systemd-fsck[797]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Jul 9 14:56:18.481567 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jul 9 14:56:18.488641 systemd[1]: Mounting sysroot.mount - /sysroot... Jul 9 14:56:18.707766 kernel: EXT4-fs (vda9): mounted filesystem b08a603c-44fa-43af-af80-90bed9b8770a r/w with ordered data mode. Quota mode: none. Jul 9 14:56:18.709431 systemd[1]: Mounted sysroot.mount - /sysroot. Jul 9 14:56:18.712012 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jul 9 14:56:18.715979 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 9 14:56:18.738765 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jul 9 14:56:18.744017 systemd-networkd[765]: eth0: Gained IPv6LL Jul 9 14:56:18.747540 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jul 9 14:56:18.752452 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jul 9 14:56:18.758976 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jul 9 14:56:18.759072 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jul 9 14:56:18.772010 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jul 9 14:56:18.776035 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (805) Jul 9 14:56:18.778756 kernel: BTRFS info (device vda6): first mount of filesystem 87056a6c-ee99-487a-9330-f1335025b841 Jul 9 14:56:18.778888 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jul 9 14:56:18.778937 kernel: BTRFS info (device vda6): using free-space-tree Jul 9 14:56:18.796846 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jul 9 14:56:18.811355 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jul 9 14:56:18.915915 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:18.937046 initrd-setup-root[834]: cut: /sysroot/etc/passwd: No such file or directory Jul 9 14:56:18.945995 initrd-setup-root[841]: cut: /sysroot/etc/group: No such file or directory Jul 9 14:56:18.950659 initrd-setup-root[848]: cut: /sysroot/etc/shadow: No such file or directory Jul 9 14:56:18.954727 initrd-setup-root[855]: cut: /sysroot/etc/gshadow: No such file or directory Jul 9 14:56:19.089791 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. 
Jul 9 14:56:19.091400 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jul 9 14:56:19.096022 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jul 9 14:56:19.121137 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jul 9 14:56:19.128546 kernel: BTRFS info (device vda6): last unmount of filesystem 87056a6c-ee99-487a-9330-f1335025b841 Jul 9 14:56:19.147273 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jul 9 14:56:19.178640 ignition[924]: INFO : Ignition 2.21.0 Jul 9 14:56:19.178640 ignition[924]: INFO : Stage: mount Jul 9 14:56:19.180062 ignition[924]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 9 14:56:19.180062 ignition[924]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jul 9 14:56:19.181516 ignition[924]: INFO : mount: mount passed Jul 9 14:56:19.181516 ignition[924]: INFO : Ignition finished successfully Jul 9 14:56:19.181457 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jul 9 14:56:19.956777 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:21.971795 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:25.984794 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:25.999890 coreos-metadata[807]: Jul 09 14:56:25.999 WARN failed to locate config-drive, using the metadata service API instead Jul 9 14:56:26.064924 coreos-metadata[807]: Jul 09 14:56:26.064 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jul 9 14:56:26.083869 coreos-metadata[807]: Jul 09 14:56:26.083 INFO Fetch successful Jul 9 14:56:26.085474 coreos-metadata[807]: Jul 09 14:56:26.084 INFO wrote hostname ci-9999-9-100-bf645a1a30.novalocal to /sysroot/etc/hostname Jul 9 14:56:26.087330 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jul 9 14:56:26.087525 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jul 9 14:56:26.091826 systemd[1]: Starting ignition-files.service - Ignition (files)... Jul 9 14:56:26.134214 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 9 14:56:26.182839 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (940) Jul 9 14:56:26.192208 kernel: BTRFS info (device vda6): first mount of filesystem 87056a6c-ee99-487a-9330-f1335025b841 Jul 9 14:56:26.192313 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jul 9 14:56:26.196452 kernel: BTRFS info (device vda6): using free-space-tree Jul 9 14:56:26.213813 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jul 9 14:56:26.354469 ignition[958]: INFO : Ignition 2.21.0 Jul 9 14:56:26.354469 ignition[958]: INFO : Stage: files Jul 9 14:56:26.357763 ignition[958]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 9 14:56:26.357763 ignition[958]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jul 9 14:56:26.357763 ignition[958]: DEBUG : files: compiled without relabeling support, skipping Jul 9 14:56:26.363670 ignition[958]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jul 9 14:56:26.363670 ignition[958]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jul 9 14:56:26.363670 ignition[958]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jul 9 14:56:26.369888 ignition[958]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jul 9 14:56:26.369888 ignition[958]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jul 9 14:56:26.369888 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jul 9 14:56:26.369888 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Jul 9 14:56:26.364378 unknown[958]: wrote ssh authorized keys file for user: core Jul 9 14:56:27.042956 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jul 9 14:56:33.025743 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jul 9 14:56:33.031641 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jul 9 14:56:33.031641 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jul 9 14:56:33.031641 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jul 9 14:56:33.031641 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jul 9 14:56:33.031641 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 9 14:56:33.031641 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 9 14:56:33.031641 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 9 14:56:33.031641 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 9 14:56:33.049285 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jul 9 14:56:33.049285 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jul 9 14:56:33.049285 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Jul 9 14:56:33.049285 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Jul 9 14:56:33.049285 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Jul 9 14:56:33.049285 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Jul 9 14:56:33.805642 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jul 9 14:56:35.691946 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Jul 9 14:56:35.691946 ignition[958]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jul 9 14:56:35.695972 ignition[958]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 9 14:56:35.704333 ignition[958]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 9 14:56:35.704333 ignition[958]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jul 9 14:56:35.704333 ignition[958]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jul 9 14:56:35.708361 ignition[958]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jul 9 14:56:35.708361 ignition[958]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jul 9 14:56:35.708361 ignition[958]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jul 9 14:56:35.708361 ignition[958]: INFO : files: files passed Jul 9 14:56:35.708361 ignition[958]: INFO : Ignition finished successfully Jul 9 14:56:35.711427 systemd[1]: Finished ignition-files.service - Ignition (files). Jul 9 14:56:35.719796 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jul 9 14:56:35.722935 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jul 9 14:56:35.754337 systemd[1]: ignition-quench.service: Deactivated successfully. Jul 9 14:56:35.754547 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jul 9 14:56:35.767244 initrd-setup-root-after-ignition[987]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 9 14:56:35.767244 initrd-setup-root-after-ignition[987]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jul 9 14:56:35.770770 initrd-setup-root-after-ignition[991]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 9 14:56:35.770055 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 9 14:56:35.772250 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jul 9 14:56:35.775883 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jul 9 14:56:35.831652 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jul 9 14:56:35.831848 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jul 9 14:56:35.833635 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Jul 9 14:56:35.834871 systemd[1]: Reached target initrd.target - Initrd Default Target. Jul 9 14:56:35.836392 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jul 9 14:56:35.837887 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jul 9 14:56:35.895002 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 9 14:56:35.901502 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jul 9 14:56:35.962674 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jul 9 14:56:35.966268 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 9 14:56:35.968286 systemd[1]: Stopped target timers.target - Timer Units. Jul 9 14:56:35.971287 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jul 9 14:56:35.971915 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 9 14:56:35.975161 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jul 9 14:56:35.977370 systemd[1]: Stopped target basic.target - Basic System. Jul 9 14:56:35.980562 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jul 9 14:56:35.983313 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jul 9 14:56:35.985938 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jul 9 14:56:35.989167 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jul 9 14:56:35.992350 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jul 9 14:56:35.995419 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jul 9 14:56:35.998392 systemd[1]: Stopped target sysinit.target - System Initialization. Jul 9 14:56:36.001599 systemd[1]: Stopped target local-fs.target - Local File Systems. Jul 9 14:56:36.004773 systemd[1]: Stopped target swap.target - Swaps. Jul 9 14:56:36.007473 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jul 9 14:56:36.008212 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jul 9 14:56:36.011087 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jul 9 14:56:36.013389 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 9 14:56:36.015990 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jul 9 14:56:36.016380 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 9 14:56:36.019179 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jul 9 14:56:36.019908 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jul 9 14:56:36.023609 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jul 9 14:56:36.024244 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 9 14:56:36.027504 systemd[1]: ignition-files.service: Deactivated successfully. Jul 9 14:56:36.028053 systemd[1]: Stopped ignition-files.service - Ignition (files). Jul 9 14:56:36.035006 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jul 9 14:56:36.040188 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jul 9 14:56:36.043082 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jul 9 14:56:36.043458 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. 
Jul 9 14:56:36.045160 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jul 9 14:56:36.045513 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jul 9 14:56:36.051847 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jul 9 14:56:36.051971 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jul 9 14:56:36.080499 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jul 9 14:56:36.085157 systemd[1]: sysroot-boot.service: Deactivated successfully. Jul 9 14:56:36.086044 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jul 9 14:56:36.089012 ignition[1011]: INFO : Ignition 2.21.0 Jul 9 14:56:36.089012 ignition[1011]: INFO : Stage: umount Jul 9 14:56:36.090335 ignition[1011]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 9 14:56:36.090335 ignition[1011]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jul 9 14:56:36.092342 ignition[1011]: INFO : umount: umount passed Jul 9 14:56:36.092342 ignition[1011]: INFO : Ignition finished successfully Jul 9 14:56:36.092091 systemd[1]: ignition-mount.service: Deactivated successfully. Jul 9 14:56:36.092202 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jul 9 14:56:36.093856 systemd[1]: ignition-disks.service: Deactivated successfully. Jul 9 14:56:36.093961 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jul 9 14:56:36.094616 systemd[1]: ignition-kargs.service: Deactivated successfully. Jul 9 14:56:36.094675 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jul 9 14:56:36.095751 systemd[1]: ignition-fetch.service: Deactivated successfully. Jul 9 14:56:36.095813 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jul 9 14:56:36.096909 systemd[1]: Stopped target network.target - Network. Jul 9 14:56:36.097924 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jul 9 14:56:36.097986 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jul 9 14:56:36.099054 systemd[1]: Stopped target paths.target - Path Units. Jul 9 14:56:36.100086 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jul 9 14:56:36.101790 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 9 14:56:36.102437 systemd[1]: Stopped target slices.target - Slice Units. Jul 9 14:56:36.103550 systemd[1]: Stopped target sockets.target - Socket Units. Jul 9 14:56:36.104682 systemd[1]: iscsid.socket: Deactivated successfully. Jul 9 14:56:36.104769 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jul 9 14:56:36.105986 systemd[1]: iscsiuio.socket: Deactivated successfully. Jul 9 14:56:36.106023 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 9 14:56:36.107247 systemd[1]: ignition-setup.service: Deactivated successfully. Jul 9 14:56:36.107312 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jul 9 14:56:36.108349 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jul 9 14:56:36.108428 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jul 9 14:56:36.109619 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jul 9 14:56:36.109687 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jul 9 14:56:36.111195 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jul 9 14:56:36.112814 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... 
Jul 9 14:56:36.121107 systemd[1]: systemd-resolved.service: Deactivated successfully. Jul 9 14:56:36.121253 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jul 9 14:56:36.129128 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Jul 9 14:56:36.129427 systemd[1]: systemd-networkd.service: Deactivated successfully. Jul 9 14:56:36.129616 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jul 9 14:56:36.131439 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Jul 9 14:56:36.132488 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jul 9 14:56:36.133809 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jul 9 14:56:36.133872 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jul 9 14:56:36.135990 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jul 9 14:56:36.138039 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jul 9 14:56:36.138104 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 9 14:56:36.140109 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jul 9 14:56:36.140172 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jul 9 14:56:36.142823 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jul 9 14:56:36.142890 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jul 9 14:56:36.144042 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jul 9 14:56:36.144106 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 9 14:56:36.145608 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 9 14:56:36.148050 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Jul 9 14:56:36.148132 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Jul 9 14:56:36.160436 systemd[1]: systemd-udevd.service: Deactivated successfully. Jul 9 14:56:36.162183 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 9 14:56:36.163263 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jul 9 14:56:36.163304 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jul 9 14:56:36.164621 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jul 9 14:56:36.164655 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jul 9 14:56:36.165900 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jul 9 14:56:36.165959 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jul 9 14:56:36.167625 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jul 9 14:56:36.167683 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jul 9 14:56:36.168932 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jul 9 14:56:36.168984 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 9 14:56:36.170931 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jul 9 14:56:36.172387 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jul 9 14:56:36.172464 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. 
Jul 9 14:56:36.176322 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jul 9 14:56:36.176373 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 9 14:56:36.177444 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jul 9 14:56:36.177487 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 9 14:56:36.178756 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jul 9 14:56:36.178812 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jul 9 14:56:36.181115 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 9 14:56:36.181200 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 9 14:56:36.184955 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Jul 9 14:56:36.185011 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Jul 9 14:56:36.185049 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Jul 9 14:56:36.185101 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jul 9 14:56:36.185461 systemd[1]: network-cleanup.service: Deactivated successfully. Jul 9 14:56:36.185571 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jul 9 14:56:36.186451 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jul 9 14:56:36.186546 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jul 9 14:56:36.188415 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jul 9 14:56:36.190492 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jul 9 14:56:36.204631 systemd[1]: Switching root. Jul 9 14:56:36.257305 systemd-journald[214]: Journal stopped Jul 9 14:56:38.695178 systemd-journald[214]: Received SIGTERM from PID 1 (systemd). Jul 9 14:56:38.695364 kernel: SELinux: policy capability network_peer_controls=1 Jul 9 14:56:38.695410 kernel: SELinux: policy capability open_perms=1 Jul 9 14:56:38.695436 kernel: SELinux: policy capability extended_socket_class=1 Jul 9 14:56:38.695457 kernel: SELinux: policy capability always_check_network=0 Jul 9 14:56:38.695496 kernel: SELinux: policy capability cgroup_seclabel=1 Jul 9 14:56:38.695516 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jul 9 14:56:38.695532 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jul 9 14:56:38.695550 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jul 9 14:56:38.695566 kernel: SELinux: policy capability userspace_initial_context=0 Jul 9 14:56:38.695584 kernel: audit: type=1403 audit(1752072997.105:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jul 9 14:56:38.695623 systemd[1]: Successfully loaded SELinux policy in 124.693ms. Jul 9 14:56:38.695676 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 28.299ms. Jul 9 14:56:38.695702 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 9 14:56:38.695914 systemd[1]: Detected virtualization kvm. 
Jul 9 14:56:38.695932 systemd[1]: Detected architecture x86-64. Jul 9 14:56:38.695952 systemd[1]: Detected first boot. Jul 9 14:56:38.695981 systemd[1]: Hostname set to . Jul 9 14:56:38.696011 systemd[1]: Initializing machine ID from VM UUID. Jul 9 14:56:38.696028 zram_generator::config[1055]: No configuration found. Jul 9 14:56:38.696052 kernel: Guest personality initialized and is inactive Jul 9 14:56:38.696083 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Jul 9 14:56:38.696096 kernel: Initialized host personality Jul 9 14:56:38.696113 kernel: NET: Registered PF_VSOCK protocol family Jul 9 14:56:38.696142 systemd[1]: Populated /etc with preset unit settings. Jul 9 14:56:38.696165 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Jul 9 14:56:38.696179 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jul 9 14:56:38.696198 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jul 9 14:56:38.696212 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jul 9 14:56:38.696234 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jul 9 14:56:38.696254 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jul 9 14:56:38.696268 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jul 9 14:56:38.696285 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jul 9 14:56:38.696301 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jul 9 14:56:38.696314 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jul 9 14:56:38.696328 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jul 9 14:56:38.696341 systemd[1]: Created slice user.slice - User and Session Slice. Jul 9 14:56:38.696364 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 9 14:56:38.696388 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 9 14:56:38.696401 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jul 9 14:56:38.696415 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jul 9 14:56:38.696428 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jul 9 14:56:38.696442 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 9 14:56:38.696455 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jul 9 14:56:38.696482 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 9 14:56:38.696508 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 9 14:56:38.696521 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jul 9 14:56:38.696544 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jul 9 14:56:38.696561 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jul 9 14:56:38.696589 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jul 9 14:56:38.696612 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 9 14:56:38.696625 systemd[1]: Reached target remote-fs.target - Remote File Systems. 
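"Initializing machine ID from VM UUID" above means the hypervisor-provided DMI UUID is reused as the machine ID on first boot. The snippet below is only an approximation of that behaviour for illustration, not systemd's actual code path; the sysfs path is the usual location of the VM UUID on KVM guests.

```python
#!/usr/bin/env python3
"""Approximate sketch of deriving a machine ID from the VM's DMI product UUID."""

DMI_UUID = "/sys/class/dmi/id/product_uuid"   # readable by root on KVM guests

def machine_id_from_vm_uuid() -> str:
    with open(DMI_UUID) as f:
        uuid = f.read().strip()
    # /etc/machine-id stores 32 lower-case hex characters without dashes
    return uuid.replace("-", "").lower()

if __name__ == "__main__":
    print(machine_id_from_vm_uuid())
```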
Jul 9 14:56:38.696639 systemd[1]: Reached target slices.target - Slice Units. Jul 9 14:56:38.696662 systemd[1]: Reached target swap.target - Swaps. Jul 9 14:56:38.696678 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jul 9 14:56:38.696691 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jul 9 14:56:38.696723 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jul 9 14:56:38.696739 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 9 14:56:38.696760 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 9 14:56:38.696774 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 9 14:56:38.696791 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jul 9 14:56:38.696804 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jul 9 14:56:38.696817 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jul 9 14:56:38.696840 systemd[1]: Mounting media.mount - External Media Directory... Jul 9 14:56:38.696854 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 14:56:38.696867 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jul 9 14:56:38.696880 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jul 9 14:56:38.696893 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jul 9 14:56:38.696907 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jul 9 14:56:38.696920 systemd[1]: Reached target machines.target - Containers. Jul 9 14:56:38.696933 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jul 9 14:56:38.696998 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 9 14:56:38.697014 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 9 14:56:38.697036 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jul 9 14:56:38.697053 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 9 14:56:38.697066 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 9 14:56:38.697079 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 9 14:56:38.697097 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jul 9 14:56:38.697110 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 9 14:56:38.697124 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jul 9 14:56:38.697146 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jul 9 14:56:38.697159 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jul 9 14:56:38.697173 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jul 9 14:56:38.697195 systemd[1]: Stopped systemd-fsck-usr.service. Jul 9 14:56:38.697213 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
Jul 9 14:56:38.697240 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 9 14:56:38.697261 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 9 14:56:38.697275 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 9 14:56:38.697297 kernel: fuse: init (API version 7.41) Jul 9 14:56:38.697319 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jul 9 14:56:38.697333 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jul 9 14:56:38.697347 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 9 14:56:38.697368 systemd[1]: verity-setup.service: Deactivated successfully. Jul 9 14:56:38.697382 systemd[1]: Stopped verity-setup.service. Jul 9 14:56:38.697396 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 14:56:38.697410 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jul 9 14:56:38.697423 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jul 9 14:56:38.697437 systemd[1]: Mounted media.mount - External Media Directory. Jul 9 14:56:38.697461 kernel: ACPI: bus type drm_connector registered Jul 9 14:56:38.697484 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jul 9 14:56:38.697497 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jul 9 14:56:38.697510 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jul 9 14:56:38.697524 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 9 14:56:38.697543 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jul 9 14:56:38.697557 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jul 9 14:56:38.697570 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 9 14:56:38.697583 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 9 14:56:38.697609 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 9 14:56:38.697623 kernel: loop: module loaded Jul 9 14:56:38.697648 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 9 14:56:38.697661 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 9 14:56:38.697675 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 9 14:56:38.697688 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jul 9 14:56:38.697702 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jul 9 14:56:38.697736 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 9 14:56:38.697751 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 9 14:56:38.697779 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 9 14:56:38.697793 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 9 14:56:38.697813 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jul 9 14:56:38.697838 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jul 9 14:56:38.697884 systemd-journald[1142]: Collecting audit messages is disabled. Jul 9 14:56:38.697950 systemd[1]: Reached target network-pre.target - Preparation for Network. 
Jul 9 14:56:38.697966 systemd-journald[1142]: Journal started Jul 9 14:56:38.698002 systemd-journald[1142]: Runtime Journal (/run/log/journal/6a84316b74fa44c4b8ddca01e384150a) is 8M, max 78.5M, 70.5M free. Jul 9 14:56:38.162826 systemd[1]: Queued start job for default target multi-user.target. Jul 9 14:56:38.184978 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jul 9 14:56:38.185585 systemd[1]: systemd-journald.service: Deactivated successfully. Jul 9 14:56:38.701754 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jul 9 14:56:38.709725 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jul 9 14:56:38.713734 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jul 9 14:56:38.717749 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 9 14:56:38.723593 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jul 9 14:56:38.725775 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jul 9 14:56:38.730741 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 9 14:56:38.740803 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jul 9 14:56:38.740863 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jul 9 14:56:38.750351 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jul 9 14:56:38.755749 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 9 14:56:38.760754 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 9 14:56:38.768532 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jul 9 14:56:38.772787 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 9 14:56:38.778742 systemd[1]: Started systemd-journald.service - Journal Service. Jul 9 14:56:38.780168 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jul 9 14:56:38.781176 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 9 14:56:38.781982 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jul 9 14:56:38.782767 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jul 9 14:56:38.783901 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jul 9 14:56:38.806747 kernel: loop0: detected capacity change from 0 to 8 Jul 9 14:56:38.809103 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jul 9 14:56:38.816897 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jul 9 14:56:38.824812 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jul 9 14:56:38.827190 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jul 9 14:56:38.845811 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 9 14:56:38.862012 systemd-journald[1142]: Time spent on flushing to /var/log/journal/6a84316b74fa44c4b8ddca01e384150a is 32.092ms for 983 entries. 
Jul 9 14:56:38.862012 systemd-journald[1142]: System Journal (/var/log/journal/6a84316b74fa44c4b8ddca01e384150a) is 8M, max 584.8M, 576.8M free. Jul 9 14:56:38.932647 systemd-journald[1142]: Received client request to flush runtime journal. Jul 9 14:56:38.932690 kernel: loop1: detected capacity change from 0 to 114008 Jul 9 14:56:38.867802 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jul 9 14:56:38.868832 systemd-tmpfiles[1175]: ACLs are not supported, ignoring. Jul 9 14:56:38.868847 systemd-tmpfiles[1175]: ACLs are not supported, ignoring. Jul 9 14:56:38.883345 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 9 14:56:38.888019 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jul 9 14:56:38.937762 kernel: loop2: detected capacity change from 0 to 146480 Jul 9 14:56:38.938988 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jul 9 14:56:38.990105 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jul 9 14:56:38.994161 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 9 14:56:39.008761 kernel: loop3: detected capacity change from 0 to 221472 Jul 9 14:56:39.052019 systemd-tmpfiles[1215]: ACLs are not supported, ignoring. Jul 9 14:56:39.053698 systemd-tmpfiles[1215]: ACLs are not supported, ignoring. Jul 9 14:56:39.069733 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 9 14:56:39.098478 kernel: loop4: detected capacity change from 0 to 8 Jul 9 14:56:39.120804 kernel: loop5: detected capacity change from 0 to 114008 Jul 9 14:56:39.191393 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jul 9 14:56:39.240765 kernel: loop6: detected capacity change from 0 to 146480 Jul 9 14:56:39.319758 kernel: loop7: detected capacity change from 0 to 221472 Jul 9 14:56:39.448843 (sd-merge)[1219]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Jul 9 14:56:39.450163 (sd-merge)[1219]: Merged extensions into '/usr'. Jul 9 14:56:39.458975 systemd[1]: Reload requested from client PID 1174 ('systemd-sysext') (unit systemd-sysext.service)... Jul 9 14:56:39.459203 systemd[1]: Reloading... Jul 9 14:56:39.730476 zram_generator::config[1245]: No configuration found. Jul 9 14:56:39.947077 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 9 14:56:40.083364 systemd[1]: Reloading finished in 622 ms. Jul 9 14:56:40.107367 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jul 9 14:56:40.111460 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jul 9 14:56:40.131178 systemd[1]: Starting ensure-sysext.service... Jul 9 14:56:40.137008 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 9 14:56:40.143198 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 9 14:56:40.146758 ldconfig[1170]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jul 9 14:56:40.173855 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. 
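The sd-merge lines above record systemd-sysext stacking the containerd-flatcar, docker-flatcar, kubernetes, and oem-openstack extensions onto /usr. The sketch below shows the general overlayfs idea behind such a merge; the extension names come from the log, but the staging directory layout and the direct `mount` call are assumptions for illustration rather than sysext's real implementation.

```python
#!/usr/bin/env python3
"""Rough sketch of merging extension trees over /usr with a read-only overlay."""
import subprocess

EXTENSIONS = ["containerd-flatcar", "docker-flatcar", "kubernetes", "oem-openstack"]
STAGING = "/run/systemd/sysext-example"    # hypothetical unpack location

def merge_usr() -> None:
    # Earlier lowerdir entries take precedence, so extensions are listed
    # first and the real /usr last.
    lowers = [f"{STAGING}/{name}/usr" for name in EXTENSIONS] + ["/usr"]
    subprocess.run(
        ["mount", "-t", "overlay", "overlay",
         "-o", "lowerdir=" + ":".join(lowers), "/usr"],
        check=True,
    )

if __name__ == "__main__":
    merge_usr()
```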
Jul 9 14:56:40.175950 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jul 9 14:56:40.175993 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jul 9 14:56:40.176292 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jul 9 14:56:40.177415 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jul 9 14:56:40.178414 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jul 9 14:56:40.179037 systemd-tmpfiles[1302]: ACLs are not supported, ignoring. Jul 9 14:56:40.179216 systemd-tmpfiles[1302]: ACLs are not supported, ignoring. Jul 9 14:56:40.183599 systemd[1]: Reload requested from client PID 1301 ('systemctl') (unit ensure-sysext.service)... Jul 9 14:56:40.183656 systemd[1]: Reloading... Jul 9 14:56:40.184629 systemd-tmpfiles[1302]: Detected autofs mount point /boot during canonicalization of boot. Jul 9 14:56:40.184639 systemd-tmpfiles[1302]: Skipping /boot Jul 9 14:56:40.220059 systemd-tmpfiles[1302]: Detected autofs mount point /boot during canonicalization of boot. Jul 9 14:56:40.220073 systemd-tmpfiles[1302]: Skipping /boot Jul 9 14:56:40.255822 systemd-udevd[1303]: Using default interface naming scheme 'v255'. Jul 9 14:56:40.282779 zram_generator::config[1334]: No configuration found. Jul 9 14:56:40.497342 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 9 14:56:40.569739 kernel: mousedev: PS/2 mouse device common for all mice Jul 9 14:56:40.629738 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Jul 9 14:56:40.629861 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jul 9 14:56:40.629966 systemd[1]: Reloading finished in 445 ms. Jul 9 14:56:40.639526 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 9 14:56:40.641480 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 9 14:56:40.677736 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 Jul 9 14:56:40.682786 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jul 9 14:56:40.694699 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 14:56:40.698140 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 9 14:56:40.704748 kernel: ACPI: button: Power Button [PWRF] Jul 9 14:56:40.715119 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jul 9 14:56:40.716950 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 9 14:56:40.718821 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 9 14:56:40.723001 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 9 14:56:40.727327 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 9 14:56:40.728957 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Jul 9 14:56:40.729106 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 9 14:56:40.733040 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jul 9 14:56:40.741217 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 9 14:56:40.749826 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 9 14:56:40.757087 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jul 9 14:56:40.757736 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 14:56:40.761855 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 9 14:56:40.762083 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 9 14:56:40.763119 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 9 14:56:40.763773 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 9 14:56:40.791538 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 14:56:40.791863 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 9 14:56:40.796037 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 9 14:56:40.799314 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 9 14:56:40.806146 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 9 14:56:40.807425 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 9 14:56:40.807592 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 9 14:56:40.814823 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jul 9 14:56:40.818504 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 9 14:56:40.828609 augenrules[1471]: No rules Jul 9 14:56:40.831151 systemd[1]: Finished ensure-sysext.service. Jul 9 14:56:40.833490 systemd[1]: audit-rules.service: Deactivated successfully. Jul 9 14:56:40.835793 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 9 14:56:40.836678 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 9 14:56:40.838046 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 9 14:56:40.840088 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 9 14:56:40.840931 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 9 14:56:40.842516 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 9 14:56:40.843342 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Jul 9 14:56:40.860848 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0 Jul 9 14:56:40.860915 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console Jul 9 14:56:40.864785 kernel: Console: switching to colour dummy device 80x25 Jul 9 14:56:40.867328 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jul 9 14:56:40.867374 kernel: [drm] features: -context_init Jul 9 14:56:40.866654 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jul 9 14:56:40.869922 kernel: [drm] number of scanouts: 1 Jul 9 14:56:40.869961 kernel: [drm] number of cap sets: 0 Jul 9 14:56:40.874019 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jul 9 14:56:40.878161 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jul 9 14:56:40.886100 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 9 14:56:40.887254 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 9 14:56:40.887512 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 9 14:56:40.889968 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 9 14:56:40.912126 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jul 9 14:56:40.916069 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jul 9 14:56:40.938575 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jul 9 14:56:40.964751 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0 Jul 9 14:56:40.981408 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jul 9 14:56:40.982141 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jul 9 14:56:40.987955 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jul 9 14:56:40.988094 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jul 9 14:56:41.021816 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jul 9 14:56:41.032234 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jul 9 14:56:41.070103 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 9 14:56:41.070930 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 9 14:56:41.073987 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jul 9 14:56:41.080270 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 9 14:56:41.126734 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 9 14:56:41.193817 systemd-networkd[1444]: lo: Link UP Jul 9 14:56:41.194183 systemd-networkd[1444]: lo: Gained carrier Jul 9 14:56:41.195745 systemd-networkd[1444]: Enumeration completed Jul 9 14:56:41.195938 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 9 14:56:41.196371 systemd-networkd[1444]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Jul 9 14:56:41.196380 systemd-networkd[1444]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 9 14:56:41.197235 systemd-networkd[1444]: eth0: Link UP Jul 9 14:56:41.197466 systemd-networkd[1444]: eth0: Gained carrier Jul 9 14:56:41.197548 systemd-networkd[1444]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 9 14:56:41.200957 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jul 9 14:56:41.203937 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jul 9 14:56:41.210685 systemd-resolved[1445]: Positive Trust Anchors: Jul 9 14:56:41.210704 systemd-resolved[1445]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 9 14:56:41.210808 systemd-networkd[1444]: eth0: DHCPv4 address 172.24.4.222/24, gateway 172.24.4.1 acquired from 172.24.4.1 Jul 9 14:56:41.211595 systemd-resolved[1445]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 9 14:56:41.217048 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jul 9 14:56:41.217214 systemd[1]: Reached target time-set.target - System Time Set. Jul 9 14:56:41.224571 systemd-resolved[1445]: Using system hostname 'ci-9999-9-100-bf645a1a30.novalocal'. Jul 9 14:56:41.230085 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 9 14:56:41.230284 systemd[1]: Reached target network.target - Network. Jul 9 14:56:41.230361 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 9 14:56:41.230438 systemd[1]: Reached target sysinit.target - System Initialization. Jul 9 14:56:41.230589 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jul 9 14:56:41.230751 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jul 9 14:56:41.230849 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jul 9 14:56:41.231091 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jul 9 14:56:41.231284 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jul 9 14:56:41.231367 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jul 9 14:56:41.231471 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jul 9 14:56:41.231518 systemd[1]: Reached target paths.target - Path Units. Jul 9 14:56:41.231602 systemd[1]: Reached target timers.target - Timer Units. Jul 9 14:56:41.233182 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jul 9 14:56:41.234603 systemd[1]: Starting docker.socket - Docker Socket for the API... Jul 9 14:56:41.237493 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). 
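systemd-networkd above reports a DHCPv4 lease of 172.24.4.222/24 with gateway 172.24.4.1 on eth0. As a small aside, the snippet below unpacks that addressing with the standard ipaddress module, just to make the prefix arithmetic explicit; the values are copied from the log.

```python
#!/usr/bin/env python3
"""Unpack the DHCPv4 lease reported by systemd-networkd above."""
import ipaddress

iface = ipaddress.ip_interface("172.24.4.222/24")   # address acquired on eth0
gateway = ipaddress.ip_address("172.24.4.1")        # gateway from the same lease

print("address:", iface.ip)
print("network:", iface.network)            # 172.24.4.0/24
print("netmask:", iface.network.netmask)    # 255.255.255.0
print("gateway in network:", gateway in iface.network)
```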
Jul 9 14:56:41.237767 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jul 9 14:56:41.237860 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jul 9 14:56:41.244579 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jul 9 14:56:41.245550 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jul 9 14:56:41.246562 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jul 9 14:56:41.246818 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jul 9 14:56:41.248385 systemd[1]: Reached target sockets.target - Socket Units. Jul 9 14:56:41.248532 systemd[1]: Reached target basic.target - Basic System. Jul 9 14:56:41.248666 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jul 9 14:56:41.248694 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jul 9 14:56:41.250254 systemd[1]: Starting containerd.service - containerd container runtime... Jul 9 14:56:41.253843 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jul 9 14:56:41.257322 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jul 9 14:56:41.261981 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jul 9 14:56:41.263399 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jul 9 14:56:41.263739 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:41.268056 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jul 9 14:56:41.268216 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jul 9 14:56:41.272021 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jul 9 14:56:41.274298 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jul 9 14:56:41.279216 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jul 9 14:56:41.281925 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jul 9 14:56:41.289955 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jul 9 14:56:41.295049 jq[1526]: false Jul 9 14:56:41.300670 systemd[1]: Starting systemd-logind.service - User Login Management... Jul 9 14:56:41.301749 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jul 9 14:56:41.302316 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jul 9 14:56:41.304548 systemd[1]: Starting update-engine.service - Update Engine... Jul 9 14:56:41.308757 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Refreshing passwd entry cache Jul 9 14:56:41.307848 oslogin_cache_refresh[1528]: Refreshing passwd entry cache Jul 9 14:56:41.309356 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jul 9 14:56:41.317379 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jul 9 14:56:41.317737 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
Jul 9 14:56:41.317968 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jul 9 14:56:41.321219 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jul 9 14:56:41.321480 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jul 9 14:56:41.325072 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Failure getting users, quitting Jul 9 14:56:41.325072 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jul 9 14:56:41.325072 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Refreshing group entry cache Jul 9 14:56:41.324099 oslogin_cache_refresh[1528]: Failure getting users, quitting Jul 9 14:56:41.324128 oslogin_cache_refresh[1528]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jul 9 14:56:41.324188 oslogin_cache_refresh[1528]: Refreshing group entry cache Jul 9 14:56:41.760746 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Failure getting groups, quitting Jul 9 14:56:41.760746 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jul 9 14:56:41.760127 systemd-timesyncd[1482]: Contacted time server 143.42.229.154:123 (0.flatcar.pool.ntp.org). Jul 9 14:56:41.760021 oslogin_cache_refresh[1528]: Failure getting groups, quitting Jul 9 14:56:41.760231 systemd-timesyncd[1482]: Initial clock synchronization to Wed 2025-07-09 14:56:41.759822 UTC. Jul 9 14:56:41.760035 oslogin_cache_refresh[1528]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jul 9 14:56:41.764098 systemd-resolved[1445]: Clock change detected. Flushing caches. Jul 9 14:56:41.765864 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jul 9 14:56:41.766132 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jul 9 14:56:41.769856 extend-filesystems[1527]: Found /dev/vda6 Jul 9 14:56:41.778918 jq[1535]: true Jul 9 14:56:41.785492 extend-filesystems[1527]: Found /dev/vda9 Jul 9 14:56:41.799577 extend-filesystems[1527]: Checking size of /dev/vda9 Jul 9 14:56:41.807160 update_engine[1534]: I20250709 14:56:41.807080 1534 main.cc:92] Flatcar Update Engine starting Jul 9 14:56:41.834352 systemd[1]: motdgen.service: Deactivated successfully. Jul 9 14:56:41.836801 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jul 9 14:56:41.837074 (ntainerd)[1558]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jul 9 14:56:41.841823 tar[1538]: linux-amd64/helm Jul 9 14:56:41.873582 jq[1560]: true Jul 9 14:56:41.878483 extend-filesystems[1527]: Resized partition /dev/vda9 Jul 9 14:56:41.882507 extend-filesystems[1571]: resize2fs 1.47.2 (1-Jan-2025) Jul 9 14:56:41.895946 dbus-daemon[1522]: [system] SELinux support is enabled Jul 9 14:56:41.898744 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks Jul 9 14:56:41.896313 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jul 9 14:56:41.900555 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jul 9 14:56:41.900593 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Jul 9 14:56:41.900703 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jul 9 14:56:41.900722 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jul 9 14:56:41.910165 kernel: EXT4-fs (vda9): resized filesystem to 2014203 Jul 9 14:56:41.920261 systemd[1]: Started update-engine.service - Update Engine. Jul 9 14:56:41.921785 update_engine[1534]: I20250709 14:56:41.920420 1534 update_check_scheduler.cc:74] Next update check in 6m30s Jul 9 14:56:41.969268 extend-filesystems[1571]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jul 9 14:56:41.969268 extend-filesystems[1571]: old_desc_blocks = 1, new_desc_blocks = 1 Jul 9 14:56:41.969268 extend-filesystems[1571]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long. Jul 9 14:56:41.927856 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jul 9 14:56:41.969610 extend-filesystems[1527]: Resized filesystem in /dev/vda9 Jul 9 14:56:41.958682 systemd[1]: extend-filesystems.service: Deactivated successfully. Jul 9 14:56:41.963685 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jul 9 14:56:41.996283 systemd-logind[1533]: New seat seat0. Jul 9 14:56:42.001183 systemd-logind[1533]: Watching system buttons on /dev/input/event2 (Power Button) Jul 9 14:56:42.001209 systemd-logind[1533]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jul 9 14:56:42.001428 systemd[1]: Started systemd-logind.service - User Login Management. Jul 9 14:56:42.076010 bash[1588]: Updated "/home/core/.ssh/authorized_keys" Jul 9 14:56:42.076325 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jul 9 14:56:42.088392 systemd[1]: Starting sshkeys.service... Jul 9 14:56:42.197558 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jul 9 14:56:42.201421 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jul 9 14:56:42.202824 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jul 9 14:56:42.222026 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:42.395804 locksmithd[1572]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jul 9 14:56:42.614370 sshd_keygen[1561]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jul 9 14:56:42.641779 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jul 9 14:56:42.645802 systemd[1]: Starting issuegen.service - Generate /run/issue... Jul 9 14:56:42.647994 systemd[1]: Started sshd@0-172.24.4.222:22-172.24.4.1:36598.service - OpenSSH per-connection server daemon (172.24.4.1:36598). Jul 9 14:56:42.697819 systemd[1]: issuegen.service: Deactivated successfully. Jul 9 14:56:42.698067 systemd[1]: Finished issuegen.service - Generate /run/issue. Jul 9 14:56:42.703535 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jul 9 14:56:42.710499 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:42.750283 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jul 9 14:56:42.753668 systemd[1]: Started getty@tty1.service - Getty on tty1. Jul 9 14:56:42.755739 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jul 9 14:56:42.756124 systemd[1]: Reached target getty.target - Login Prompts. 
Jul 9 14:56:42.828739 containerd[1558]: time="2025-07-09T14:56:42Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jul 9 14:56:42.829823 containerd[1558]: time="2025-07-09T14:56:42.829579637Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Jul 9 14:56:42.846335 containerd[1558]: time="2025-07-09T14:56:42.846272625Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.987µs" Jul 9 14:56:42.846335 containerd[1558]: time="2025-07-09T14:56:42.846306017Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jul 9 14:56:42.846335 containerd[1558]: time="2025-07-09T14:56:42.846328750Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jul 9 14:56:42.846717 containerd[1558]: time="2025-07-09T14:56:42.846570053Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jul 9 14:56:42.846717 containerd[1558]: time="2025-07-09T14:56:42.846590010Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jul 9 14:56:42.846717 containerd[1558]: time="2025-07-09T14:56:42.846629965Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 9 14:56:42.846851 containerd[1558]: time="2025-07-09T14:56:42.846743297Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 9 14:56:42.846851 containerd[1558]: time="2025-07-09T14:56:42.846763285Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 9 14:56:42.847110 containerd[1558]: time="2025-07-09T14:56:42.847056014Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 9 14:56:42.847110 containerd[1558]: time="2025-07-09T14:56:42.847103483Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 9 14:56:42.847195 containerd[1558]: time="2025-07-09T14:56:42.847118661Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 9 14:56:42.847195 containerd[1558]: time="2025-07-09T14:56:42.847129031Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jul 9 14:56:42.847253 containerd[1558]: time="2025-07-09T14:56:42.847240390Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jul 9 14:56:42.847744 containerd[1558]: time="2025-07-09T14:56:42.847681587Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 9 14:56:42.847744 containerd[1558]: time="2025-07-09T14:56:42.847725590Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 Jul 9 14:56:42.847744 containerd[1558]: time="2025-07-09T14:56:42.847738654Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jul 9 14:56:42.847839 containerd[1558]: time="2025-07-09T14:56:42.847812843Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jul 9 14:56:42.851808 containerd[1558]: time="2025-07-09T14:56:42.851694323Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jul 9 14:56:42.851881 containerd[1558]: time="2025-07-09T14:56:42.851802185Z" level=info msg="metadata content store policy set" policy=shared Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.864594257Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.864807567Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.864879302Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.864899109Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.864914487Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.865241711Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.865264193Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.865315640Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.865335146Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.865347840Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.865359101Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jul 9 14:56:42.865445 containerd[1558]: time="2025-07-09T14:56:42.865396982Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jul 9 14:56:42.865788 containerd[1558]: time="2025-07-09T14:56:42.865595094Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jul 9 14:56:42.865788 containerd[1558]: time="2025-07-09T14:56:42.865648244Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jul 9 14:56:42.865788 containerd[1558]: time="2025-07-09T14:56:42.865667590Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jul 9 14:56:42.865788 containerd[1558]: time="2025-07-09T14:56:42.865683941Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff 
type=io.containerd.grpc.v1 Jul 9 14:56:42.865788 containerd[1558]: time="2025-07-09T14:56:42.865695973Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 9 14:56:42.865788 containerd[1558]: time="2025-07-09T14:56:42.865707695Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 9 14:56:42.865788 containerd[1558]: time="2025-07-09T14:56:42.865720189Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 9 14:56:42.865964 containerd[1558]: time="2025-07-09T14:56:42.865937957Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 9 14:56:42.866000 containerd[1558]: time="2025-07-09T14:56:42.865963234Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 9 14:56:42.866027 containerd[1558]: time="2025-07-09T14:56:42.866001576Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 9 14:56:42.866052 containerd[1558]: time="2025-07-09T14:56:42.866028557Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 9 14:56:42.866183 containerd[1558]: time="2025-07-09T14:56:42.866157950Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 9 14:56:42.866222 containerd[1558]: time="2025-07-09T14:56:42.866187876Z" level=info msg="Start snapshots syncer" Jul 9 14:56:42.866367 containerd[1558]: time="2025-07-09T14:56:42.866333789Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 9 14:56:42.867217 containerd[1558]: time="2025-07-09T14:56:42.867132588Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 9 
14:56:42.867433 containerd[1558]: time="2025-07-09T14:56:42.867230531Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jul 9 14:56:42.867706 containerd[1558]: time="2025-07-09T14:56:42.867663854Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 9 14:56:42.868026 containerd[1558]: time="2025-07-09T14:56:42.867996909Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 9 14:56:42.868066 containerd[1558]: time="2025-07-09T14:56:42.868034128Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 9 14:56:42.868066 containerd[1558]: time="2025-07-09T14:56:42.868047524Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 9 14:56:42.868066 containerd[1558]: time="2025-07-09T14:56:42.868059586Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 9 14:56:42.868154 containerd[1558]: time="2025-07-09T14:56:42.868072440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 9 14:56:42.868154 containerd[1558]: time="2025-07-09T14:56:42.868085264Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 9 14:56:42.868154 containerd[1558]: time="2025-07-09T14:56:42.868102086Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 9 14:56:42.868154 containerd[1558]: time="2025-07-09T14:56:42.868131661Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 9 14:56:42.868154 containerd[1558]: time="2025-07-09T14:56:42.868145377Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 9 14:56:42.868260 containerd[1558]: time="2025-07-09T14:56:42.868158001Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jul 9 14:56:42.868487 containerd[1558]: time="2025-07-09T14:56:42.868300809Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 9 14:56:42.868487 containerd[1558]: time="2025-07-09T14:56:42.868326978Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 9 14:56:42.868487 containerd[1558]: time="2025-07-09T14:56:42.868341234Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 9 14:56:42.868487 containerd[1558]: time="2025-07-09T14:56:42.868422507Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 9 14:56:42.868487 containerd[1558]: time="2025-07-09T14:56:42.868436032Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 9 14:56:42.868613 containerd[1558]: time="2025-07-09T14:56:42.868448636Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 9 14:56:42.868613 containerd[1558]: time="2025-07-09T14:56:42.868586835Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 9 14:56:42.868613 
containerd[1558]: time="2025-07-09T14:56:42.868610549Z" level=info msg="runtime interface created" Jul 9 14:56:42.868685 containerd[1558]: time="2025-07-09T14:56:42.868616861Z" level=info msg="created NRI interface" Jul 9 14:56:42.868685 containerd[1558]: time="2025-07-09T14:56:42.868625888Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 9 14:56:42.868685 containerd[1558]: time="2025-07-09T14:56:42.868639484Z" level=info msg="Connect containerd service" Jul 9 14:56:42.868685 containerd[1558]: time="2025-07-09T14:56:42.868666895Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jul 9 14:56:42.870750 containerd[1558]: time="2025-07-09T14:56:42.870710267Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 9 14:56:43.085838 tar[1538]: linux-amd64/LICENSE Jul 9 14:56:43.086238 tar[1538]: linux-amd64/README.md Jul 9 14:56:43.100901 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jul 9 14:56:43.380506 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:43.422889 systemd-networkd[1444]: eth0: Gained IPv6LL Jul 9 14:56:43.431538 containerd[1558]: time="2025-07-09T14:56:43.431322075Z" level=info msg="Start subscribing containerd event" Jul 9 14:56:43.431830 containerd[1558]: time="2025-07-09T14:56:43.431621697Z" level=info msg="Start recovering state" Jul 9 14:56:43.432112 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jul 9 14:56:43.433388 containerd[1558]: time="2025-07-09T14:56:43.433329961Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 9 14:56:43.433690 containerd[1558]: time="2025-07-09T14:56:43.433659709Z" level=info msg=serving... address=/run/containerd/containerd.sock Jul 9 14:56:43.434967 systemd[1]: Reached target network-online.target - Network is Online. Jul 9 14:56:43.435706 containerd[1558]: time="2025-07-09T14:56:43.435651725Z" level=info msg="Start event monitor" Jul 9 14:56:43.435777 containerd[1558]: time="2025-07-09T14:56:43.435731875Z" level=info msg="Start cni network conf syncer for default" Jul 9 14:56:43.435777 containerd[1558]: time="2025-07-09T14:56:43.435760790Z" level=info msg="Start streaming server" Jul 9 14:56:43.435857 containerd[1558]: time="2025-07-09T14:56:43.435790075Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 9 14:56:43.435857 containerd[1558]: time="2025-07-09T14:56:43.435807768Z" level=info msg="runtime interface starting up..." Jul 9 14:56:43.435857 containerd[1558]: time="2025-07-09T14:56:43.435830290Z" level=info msg="starting plugins..." Jul 9 14:56:43.435984 containerd[1558]: time="2025-07-09T14:56:43.435860036Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 9 14:56:43.436205 containerd[1558]: time="2025-07-09T14:56:43.436159998Z" level=info msg="containerd successfully booted in 0.608195s" Jul 9 14:56:43.444618 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 14:56:43.450552 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jul 9 14:56:43.451169 systemd[1]: Started containerd.service - containerd container runtime. Jul 9 14:56:43.478895 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
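Note: the containerd error just above, "no network config found in /etc/cni/net.d", is typically expected on a freshly provisioned node where no CNI plugin has installed a configuration yet; the "Start cni network conf syncer for default" message in the same span shows containerd watching that directory and picking a config up once it appears. Purely as an illustration of what eventually lands there (none of these names or addresses come from this host), a minimal bridge-based conflist could look like:

  {
    "cniVersion": "1.0.0",
    "name": "example-net",
    "plugins": [
      {
        "type": "bridge",
        "bridge": "cni0",
        "isGateway": true,
        "ipMasq": true,
        "ipam": {
          "type": "host-local",
          "ranges": [[{ "subnet": "10.88.0.0/16" }]],
          "routes": [{ "dst": "0.0.0.0/0" }]
        }
      },
      { "type": "portmap", "capabilities": { "portMappings": true } }
    ]
  }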
Jul 9 14:56:44.206709 sshd[1614]: Accepted publickey for core from 172.24.4.1 port 36598 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 14:56:44.212065 sshd-session[1614]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 14:56:44.235126 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 9 14:56:44.241111 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 9 14:56:44.265691 systemd-logind[1533]: New session 1 of user core. Jul 9 14:56:44.293446 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 9 14:56:44.297002 systemd[1]: Starting user@500.service - User Manager for UID 500... Jul 9 14:56:44.309994 (systemd)[1657]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 9 14:56:44.313840 systemd-logind[1533]: New session c1 of user core. Jul 9 14:56:44.564628 systemd[1657]: Queued start job for default target default.target. Jul 9 14:56:44.569097 systemd[1657]: Created slice app.slice - User Application Slice. Jul 9 14:56:44.569248 systemd[1657]: Reached target paths.target - Paths. Jul 9 14:56:44.569382 systemd[1657]: Reached target timers.target - Timers. Jul 9 14:56:44.574561 systemd[1657]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 9 14:56:44.592712 systemd[1657]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 9 14:56:44.593817 systemd[1657]: Reached target sockets.target - Sockets. Jul 9 14:56:44.593966 systemd[1657]: Reached target basic.target - Basic System. Jul 9 14:56:44.594178 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 9 14:56:44.594303 systemd[1657]: Reached target default.target - Main User Target. Jul 9 14:56:44.594426 systemd[1657]: Startup finished in 271ms. Jul 9 14:56:44.602648 systemd[1]: Started session-1.scope - Session 1 of User core. Jul 9 14:56:44.752564 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:44.979980 systemd[1]: Started sshd@1-172.24.4.222:22-172.24.4.1:58710.service - OpenSSH per-connection server daemon (172.24.4.1:58710). Jul 9 14:56:45.415531 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:46.034745 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 14:56:46.051108 (kubelet)[1678]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 9 14:56:46.269009 sshd[1669]: Accepted publickey for core from 172.24.4.1 port 58710 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 14:56:46.273345 sshd-session[1669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 14:56:46.287096 systemd-logind[1533]: New session 2 of user core. Jul 9 14:56:46.297982 systemd[1]: Started session-2.scope - Session 2 of User core. Jul 9 14:56:46.896510 sshd[1684]: Connection closed by 172.24.4.1 port 58710 Jul 9 14:56:46.897520 sshd-session[1669]: pam_unix(sshd:session): session closed for user core Jul 9 14:56:46.921531 systemd[1]: sshd@1-172.24.4.222:22-172.24.4.1:58710.service: Deactivated successfully. Jul 9 14:56:46.924129 systemd[1]: session-2.scope: Deactivated successfully. Jul 9 14:56:46.925694 systemd-logind[1533]: Session 2 logged out. Waiting for processes to exit. Jul 9 14:56:46.929774 systemd[1]: Started sshd@2-172.24.4.222:22-172.24.4.1:58720.service - OpenSSH per-connection server daemon (172.24.4.1:58720). 
Jul 9 14:56:46.931782 systemd-logind[1533]: Removed session 2. Jul 9 14:56:47.841858 login[1622]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jul 9 14:56:47.854150 login[1623]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jul 9 14:56:47.854189 systemd-logind[1533]: New session 3 of user core. Jul 9 14:56:47.860385 systemd[1]: Started session-3.scope - Session 3 of User core. Jul 9 14:56:47.868721 systemd-logind[1533]: New session 4 of user core. Jul 9 14:56:47.874013 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 9 14:56:48.079388 kubelet[1678]: E0709 14:56:48.079260 1678 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 9 14:56:48.083911 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 9 14:56:48.084272 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 9 14:56:48.085808 systemd[1]: kubelet.service: Consumed 2.734s CPU time, 264.9M memory peak. Jul 9 14:56:48.496804 sshd[1690]: Accepted publickey for core from 172.24.4.1 port 58720 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 14:56:48.497899 sshd-session[1690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 14:56:48.510102 systemd-logind[1533]: New session 5 of user core. Jul 9 14:56:48.533164 systemd[1]: Started session-5.scope - Session 5 of User core. Jul 9 14:56:48.785528 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:48.800308 coreos-metadata[1521]: Jul 09 14:56:48.800 WARN failed to locate config-drive, using the metadata service API instead Jul 9 14:56:48.851329 coreos-metadata[1521]: Jul 09 14:56:48.851 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jul 9 14:56:49.107631 sshd[1725]: Connection closed by 172.24.4.1 port 58720 Jul 9 14:56:49.107852 sshd-session[1690]: pam_unix(sshd:session): session closed for user core Jul 9 14:56:49.117113 systemd[1]: sshd@2-172.24.4.222:22-172.24.4.1:58720.service: Deactivated successfully. Jul 9 14:56:49.121889 systemd[1]: session-5.scope: Deactivated successfully. Jul 9 14:56:49.124789 systemd-logind[1533]: Session 5 logged out. Waiting for processes to exit. Jul 9 14:56:49.127603 coreos-metadata[1521]: Jul 09 14:56:49.127 INFO Fetch successful Jul 9 14:56:49.128538 coreos-metadata[1521]: Jul 09 14:56:49.128 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jul 9 14:56:49.129646 systemd-logind[1533]: Removed session 5. 
Jul 9 14:56:49.140774 coreos-metadata[1521]: Jul 09 14:56:49.140 INFO Fetch successful Jul 9 14:56:49.141126 coreos-metadata[1521]: Jul 09 14:56:49.141 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jul 9 14:56:49.153241 coreos-metadata[1521]: Jul 09 14:56:49.153 INFO Fetch successful Jul 9 14:56:49.153591 coreos-metadata[1521]: Jul 09 14:56:49.153 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jul 9 14:56:49.166329 coreos-metadata[1521]: Jul 09 14:56:49.166 INFO Fetch successful Jul 9 14:56:49.166671 coreos-metadata[1521]: Jul 09 14:56:49.166 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jul 9 14:56:49.178278 coreos-metadata[1521]: Jul 09 14:56:49.178 INFO Fetch successful Jul 9 14:56:49.178278 coreos-metadata[1521]: Jul 09 14:56:49.178 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jul 9 14:56:49.190052 coreos-metadata[1521]: Jul 09 14:56:49.189 INFO Fetch successful Jul 9 14:56:49.245840 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jul 9 14:56:49.247988 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jul 9 14:56:49.460548 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Jul 9 14:56:49.478065 coreos-metadata[1594]: Jul 09 14:56:49.477 WARN failed to locate config-drive, using the metadata service API instead Jul 9 14:56:49.520393 coreos-metadata[1594]: Jul 09 14:56:49.520 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jul 9 14:56:49.535774 coreos-metadata[1594]: Jul 09 14:56:49.535 INFO Fetch successful Jul 9 14:56:49.535774 coreos-metadata[1594]: Jul 09 14:56:49.535 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jul 9 14:56:49.547520 coreos-metadata[1594]: Jul 09 14:56:49.547 INFO Fetch successful Jul 9 14:56:49.553774 unknown[1594]: wrote ssh authorized keys file for user: core Jul 9 14:56:49.606680 update-ssh-keys[1740]: Updated "/home/core/.ssh/authorized_keys" Jul 9 14:56:49.609821 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jul 9 14:56:49.613373 systemd[1]: Finished sshkeys.service. Jul 9 14:56:49.619569 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 9 14:56:49.619995 systemd[1]: Startup finished in 4.874s (kernel) + 23.226s (initrd) + 12.211s (userspace) = 40.312s. Jul 9 14:56:58.335836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jul 9 14:56:58.339720 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 14:56:58.790920 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 14:56:58.812037 (kubelet)[1751]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 9 14:56:58.930683 kubelet[1751]: E0709 14:56:58.930599 1751 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 9 14:56:58.936962 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 9 14:56:58.937232 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jul 9 14:56:58.938038 systemd[1]: kubelet.service: Consumed 426ms CPU time, 108.8M memory peak. Jul 9 14:56:59.131671 systemd[1]: Started sshd@3-172.24.4.222:22-172.24.4.1:56886.service - OpenSSH per-connection server daemon (172.24.4.1:56886). Jul 9 14:57:00.216727 sshd[1759]: Accepted publickey for core from 172.24.4.1 port 56886 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 14:57:00.219740 sshd-session[1759]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 14:57:00.232681 systemd-logind[1533]: New session 6 of user core. Jul 9 14:57:00.249916 systemd[1]: Started session-6.scope - Session 6 of User core. Jul 9 14:57:00.696693 sshd[1762]: Connection closed by 172.24.4.1 port 56886 Jul 9 14:57:00.697888 sshd-session[1759]: pam_unix(sshd:session): session closed for user core Jul 9 14:57:00.717103 systemd[1]: sshd@3-172.24.4.222:22-172.24.4.1:56886.service: Deactivated successfully. Jul 9 14:57:00.721019 systemd[1]: session-6.scope: Deactivated successfully. Jul 9 14:57:00.724328 systemd-logind[1533]: Session 6 logged out. Waiting for processes to exit. Jul 9 14:57:00.729954 systemd[1]: Started sshd@4-172.24.4.222:22-172.24.4.1:56898.service - OpenSSH per-connection server daemon (172.24.4.1:56898). Jul 9 14:57:00.732621 systemd-logind[1533]: Removed session 6. Jul 9 14:57:01.811358 sshd[1768]: Accepted publickey for core from 172.24.4.1 port 56898 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 14:57:01.814353 sshd-session[1768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 14:57:01.827575 systemd-logind[1533]: New session 7 of user core. Jul 9 14:57:01.835817 systemd[1]: Started session-7.scope - Session 7 of User core. Jul 9 14:57:02.315163 sshd[1771]: Connection closed by 172.24.4.1 port 56898 Jul 9 14:57:02.316218 sshd-session[1768]: pam_unix(sshd:session): session closed for user core Jul 9 14:57:02.331989 systemd[1]: sshd@4-172.24.4.222:22-172.24.4.1:56898.service: Deactivated successfully. Jul 9 14:57:02.336332 systemd[1]: session-7.scope: Deactivated successfully. Jul 9 14:57:02.338865 systemd-logind[1533]: Session 7 logged out. Waiting for processes to exit. Jul 9 14:57:02.344665 systemd[1]: Started sshd@5-172.24.4.222:22-172.24.4.1:56914.service - OpenSSH per-connection server daemon (172.24.4.1:56914). Jul 9 14:57:02.347574 systemd-logind[1533]: Removed session 7. Jul 9 14:57:03.517917 sshd[1777]: Accepted publickey for core from 172.24.4.1 port 56914 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 14:57:03.536237 sshd-session[1777]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 14:57:03.548572 systemd-logind[1533]: New session 8 of user core. Jul 9 14:57:03.556813 systemd[1]: Started session-8.scope - Session 8 of User core. Jul 9 14:57:04.251430 sshd[1780]: Connection closed by 172.24.4.1 port 56914 Jul 9 14:57:04.252661 sshd-session[1777]: pam_unix(sshd:session): session closed for user core Jul 9 14:57:04.272145 systemd[1]: sshd@5-172.24.4.222:22-172.24.4.1:56914.service: Deactivated successfully. Jul 9 14:57:04.277120 systemd[1]: session-8.scope: Deactivated successfully. Jul 9 14:57:04.279226 systemd-logind[1533]: Session 8 logged out. Waiting for processes to exit. Jul 9 14:57:04.286642 systemd[1]: Started sshd@6-172.24.4.222:22-172.24.4.1:39984.service - OpenSSH per-connection server daemon (172.24.4.1:39984). Jul 9 14:57:04.288808 systemd-logind[1533]: Removed session 8. 
Jul 9 14:57:05.420184 sshd[1786]: Accepted publickey for core from 172.24.4.1 port 39984 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 14:57:05.423176 sshd-session[1786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 14:57:05.435570 systemd-logind[1533]: New session 9 of user core. Jul 9 14:57:05.447853 systemd[1]: Started session-9.scope - Session 9 of User core. Jul 9 14:57:05.760649 sudo[1790]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 9 14:57:05.761343 sudo[1790]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 9 14:57:05.784354 sudo[1790]: pam_unix(sudo:session): session closed for user root Jul 9 14:57:05.940501 sshd[1789]: Connection closed by 172.24.4.1 port 39984 Jul 9 14:57:05.941995 sshd-session[1786]: pam_unix(sshd:session): session closed for user core Jul 9 14:57:05.954992 systemd[1]: sshd@6-172.24.4.222:22-172.24.4.1:39984.service: Deactivated successfully. Jul 9 14:57:05.959042 systemd[1]: session-9.scope: Deactivated successfully. Jul 9 14:57:05.962897 systemd-logind[1533]: Session 9 logged out. Waiting for processes to exit. Jul 9 14:57:05.968818 systemd[1]: Started sshd@7-172.24.4.222:22-172.24.4.1:39994.service - OpenSSH per-connection server daemon (172.24.4.1:39994). Jul 9 14:57:05.971215 systemd-logind[1533]: Removed session 9. Jul 9 14:57:07.478517 sshd[1796]: Accepted publickey for core from 172.24.4.1 port 39994 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 14:57:07.482035 sshd-session[1796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 14:57:07.495843 systemd-logind[1533]: New session 10 of user core. Jul 9 14:57:07.518959 systemd[1]: Started session-10.scope - Session 10 of User core. Jul 9 14:57:07.845958 sudo[1801]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 9 14:57:07.847674 sudo[1801]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 9 14:57:07.863690 sudo[1801]: pam_unix(sudo:session): session closed for user root Jul 9 14:57:07.881732 sudo[1800]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 9 14:57:07.882604 sudo[1800]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 9 14:57:07.951813 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 9 14:57:08.029872 augenrules[1823]: No rules Jul 9 14:57:08.031900 systemd[1]: audit-rules.service: Deactivated successfully. Jul 9 14:57:08.032393 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 9 14:57:08.035134 sudo[1800]: pam_unix(sudo:session): session closed for user root Jul 9 14:57:08.221516 sshd[1799]: Connection closed by 172.24.4.1 port 39994 Jul 9 14:57:08.222589 sshd-session[1796]: pam_unix(sshd:session): session closed for user core Jul 9 14:57:08.241660 systemd[1]: sshd@7-172.24.4.222:22-172.24.4.1:39994.service: Deactivated successfully. Jul 9 14:57:08.245536 systemd[1]: session-10.scope: Deactivated successfully. Jul 9 14:57:08.249219 systemd-logind[1533]: Session 10 logged out. Waiting for processes to exit. Jul 9 14:57:08.254413 systemd[1]: Started sshd@8-172.24.4.222:22-172.24.4.1:39996.service - OpenSSH per-connection server daemon (172.24.4.1:39996). Jul 9 14:57:08.258106 systemd-logind[1533]: Removed session 10. 
Jul 9 14:57:09.102028 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jul 9 14:57:09.107186 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 14:57:09.500741 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 14:57:09.514724 (kubelet)[1843]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 9 14:57:09.620085 sshd[1832]: Accepted publickey for core from 172.24.4.1 port 39996 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 14:57:09.624999 sshd-session[1832]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 14:57:09.639573 systemd-logind[1533]: New session 11 of user core. Jul 9 14:57:09.648141 systemd[1]: Started session-11.scope - Session 11 of User core. Jul 9 14:57:09.735929 kubelet[1843]: E0709 14:57:09.735779 1843 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 9 14:57:09.741145 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 9 14:57:09.741582 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 9 14:57:09.742874 systemd[1]: kubelet.service: Consumed 370ms CPU time, 108.3M memory peak. Jul 9 14:57:10.236879 sudo[1851]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 9 14:57:10.237598 sudo[1851]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 9 14:57:11.627312 systemd[1]: Starting docker.service - Docker Application Container Engine... Jul 9 14:57:11.646433 (dockerd)[1869]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 9 14:57:12.505325 dockerd[1869]: time="2025-07-09T14:57:12.505079379Z" level=info msg="Starting up" Jul 9 14:57:12.508185 dockerd[1869]: time="2025-07-09T14:57:12.506050771Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 9 14:57:12.542062 dockerd[1869]: time="2025-07-09T14:57:12.541966531Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jul 9 14:57:12.621906 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport386323185-merged.mount: Deactivated successfully. Jul 9 14:57:12.681975 dockerd[1869]: time="2025-07-09T14:57:12.681738213Z" level=info msg="Loading containers: start." Jul 9 14:57:12.709494 kernel: Initializing XFRM netlink socket Jul 9 14:57:13.094305 systemd-networkd[1444]: docker0: Link UP Jul 9 14:57:13.103595 dockerd[1869]: time="2025-07-09T14:57:13.103352901Z" level=info msg="Loading containers: done." 
Jul 9 14:57:13.129272 dockerd[1869]: time="2025-07-09T14:57:13.129186632Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 9 14:57:13.129564 dockerd[1869]: time="2025-07-09T14:57:13.129309813Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jul 9 14:57:13.129564 dockerd[1869]: time="2025-07-09T14:57:13.129392358Z" level=info msg="Initializing buildkit" Jul 9 14:57:13.131175 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3497599947-merged.mount: Deactivated successfully. Jul 9 14:57:13.177552 dockerd[1869]: time="2025-07-09T14:57:13.177481559Z" level=info msg="Completed buildkit initialization" Jul 9 14:57:13.194674 dockerd[1869]: time="2025-07-09T14:57:13.194574317Z" level=info msg="Daemon has completed initialization" Jul 9 14:57:13.196260 dockerd[1869]: time="2025-07-09T14:57:13.194731211Z" level=info msg="API listen on /run/docker.sock" Jul 9 14:57:13.195412 systemd[1]: Started docker.service - Docker Application Container Engine. Jul 9 14:57:14.990721 containerd[1558]: time="2025-07-09T14:57:14.990249780Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\"" Jul 9 14:57:15.867242 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount765452005.mount: Deactivated successfully. Jul 9 14:57:18.049667 containerd[1558]: time="2025-07-09T14:57:18.049595317Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:18.050923 containerd[1558]: time="2025-07-09T14:57:18.050872940Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.10: active requests=0, bytes read=28077752" Jul 9 14:57:18.052103 containerd[1558]: time="2025-07-09T14:57:18.052050234Z" level=info msg="ImageCreate event name:\"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:18.056572 containerd[1558]: time="2025-07-09T14:57:18.056524818Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:18.057773 containerd[1558]: time="2025-07-09T14:57:18.057742427Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.10\" with image id \"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\", size \"28074544\" in 3.067007394s" Jul 9 14:57:18.057890 containerd[1558]: time="2025-07-09T14:57:18.057872011Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\" returns image reference \"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\"" Jul 9 14:57:18.058487 containerd[1558]: time="2025-07-09T14:57:18.058422556Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\"" Jul 9 14:57:19.850917 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jul 9 14:57:19.854505 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jul 9 14:57:20.199097 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 14:57:20.235826 (kubelet)[2145]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 9 14:57:20.361660 kubelet[2145]: E0709 14:57:20.361521 2145 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 9 14:57:20.364103 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 9 14:57:20.364253 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 9 14:57:20.365283 systemd[1]: kubelet.service: Consumed 402ms CPU time, 108.4M memory peak. Jul 9 14:57:20.749257 containerd[1558]: time="2025-07-09T14:57:20.749078510Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:20.752591 containerd[1558]: time="2025-07-09T14:57:20.752278715Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.10: active requests=0, bytes read=24713302" Jul 9 14:57:20.754653 containerd[1558]: time="2025-07-09T14:57:20.754415482Z" level=info msg="ImageCreate event name:\"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:20.761508 containerd[1558]: time="2025-07-09T14:57:20.761322256Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:20.764364 containerd[1558]: time="2025-07-09T14:57:20.764281207Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.10\" with image id \"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\", size \"26315128\" in 2.705815301s" Jul 9 14:57:20.764784 containerd[1558]: time="2025-07-09T14:57:20.764605047Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\" returns image reference \"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\"" Jul 9 14:57:20.768209 containerd[1558]: time="2025-07-09T14:57:20.767897095Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\"" Jul 9 14:57:23.686700 containerd[1558]: time="2025-07-09T14:57:23.686547742Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:23.688003 containerd[1558]: time="2025-07-09T14:57:23.687968493Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.10: active requests=0, bytes read=18783679" Jul 9 14:57:23.689671 containerd[1558]: time="2025-07-09T14:57:23.689627861Z" level=info msg="ImageCreate event name:\"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:23.694567 containerd[1558]: time="2025-07-09T14:57:23.694513902Z" level=info 
msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:23.696467 containerd[1558]: time="2025-07-09T14:57:23.696349110Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.10\" with image id \"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\", size \"20385523\" in 2.928394597s" Jul 9 14:57:23.696467 containerd[1558]: time="2025-07-09T14:57:23.696408843Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\" returns image reference \"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\"" Jul 9 14:57:23.697888 containerd[1558]: time="2025-07-09T14:57:23.697852626Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.10\"" Jul 9 14:57:25.398714 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3844861583.mount: Deactivated successfully. Jul 9 14:57:26.232840 containerd[1558]: time="2025-07-09T14:57:26.232528377Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:26.238625 containerd[1558]: time="2025-07-09T14:57:26.236036998Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.10: active requests=0, bytes read=30383951" Jul 9 14:57:26.240511 containerd[1558]: time="2025-07-09T14:57:26.239840392Z" level=info msg="ImageCreate event name:\"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:26.244340 containerd[1558]: time="2025-07-09T14:57:26.244166479Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:26.245278 containerd[1558]: time="2025-07-09T14:57:26.244761015Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.10\" with image id \"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\", repo tag \"registry.k8s.io/kube-proxy:v1.31.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\", size \"30382962\" in 2.546755573s" Jul 9 14:57:26.245278 containerd[1558]: time="2025-07-09T14:57:26.244884578Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.10\" returns image reference \"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\"" Jul 9 14:57:26.249416 containerd[1558]: time="2025-07-09T14:57:26.249370725Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jul 9 14:57:27.212444 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3979641618.mount: Deactivated successfully. Jul 9 14:57:27.323706 update_engine[1534]: I20250709 14:57:27.323096 1534 update_attempter.cc:509] Updating boot flags... 
Jul 9 14:57:28.942110 containerd[1558]: time="2025-07-09T14:57:28.942030872Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:28.947700 containerd[1558]: time="2025-07-09T14:57:28.947643713Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" Jul 9 14:57:28.950803 containerd[1558]: time="2025-07-09T14:57:28.950715913Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:28.954517 containerd[1558]: time="2025-07-09T14:57:28.954478369Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:28.956199 containerd[1558]: time="2025-07-09T14:57:28.956068285Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.706658127s" Jul 9 14:57:28.956199 containerd[1558]: time="2025-07-09T14:57:28.956116826Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jul 9 14:57:28.957633 containerd[1558]: time="2025-07-09T14:57:28.957608818Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 9 14:57:29.564105 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3975815829.mount: Deactivated successfully. 
Jul 9 14:57:29.577175 containerd[1558]: time="2025-07-09T14:57:29.577032131Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 9 14:57:29.579109 containerd[1558]: time="2025-07-09T14:57:29.579011949Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Jul 9 14:57:29.581703 containerd[1558]: time="2025-07-09T14:57:29.581576385Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 9 14:57:29.589610 containerd[1558]: time="2025-07-09T14:57:29.589447616Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 9 14:57:29.592603 containerd[1558]: time="2025-07-09T14:57:29.591313951Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 633.664306ms" Jul 9 14:57:29.592603 containerd[1558]: time="2025-07-09T14:57:29.591389002Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jul 9 14:57:29.593621 containerd[1558]: time="2025-07-09T14:57:29.593536886Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Jul 9 14:57:30.602379 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jul 9 14:57:30.614326 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 14:57:30.724384 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2795364331.mount: Deactivated successfully. Jul 9 14:57:31.884733 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 14:57:31.896650 (kubelet)[2257]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 9 14:57:32.034533 kubelet[2257]: E0709 14:57:32.034293 2257 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 9 14:57:32.038060 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 9 14:57:32.038260 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 9 14:57:32.039519 systemd[1]: kubelet.service: Consumed 628ms CPU time, 109.4M memory peak. 
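Note: every kubelet start attempt so far has failed for the same reason recorded above: /var/lib/kubelet/config.yaml does not exist yet, so systemd keeps scheduling restarts (the counter has reached 4 at this point). On a node later joined to a cluster with kubeadm, that file is written during kubeadm init/join. The sketch below is only an illustration of the general shape of such a KubeletConfiguration, with placeholder values rather than this node's eventual settings:

  apiVersion: kubelet.config.k8s.io/v1beta1
  kind: KubeletConfiguration
  cgroupDriver: systemd            # matches the SystemdCgroup=true runc option in the containerd config logged earlier
  staticPodPath: /etc/kubernetes/manifests
  clusterDomain: cluster.local
  clusterDNS:
    - 10.96.0.10                   # placeholder service-network DNS address
  authentication:
    anonymous:
      enabled: false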
Jul 9 14:57:35.665644 containerd[1558]: time="2025-07-09T14:57:35.664909994Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:35.674540 containerd[1558]: time="2025-07-09T14:57:35.666537248Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780021" Jul 9 14:57:35.674540 containerd[1558]: time="2025-07-09T14:57:35.672070776Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:35.679554 containerd[1558]: time="2025-07-09T14:57:35.679436813Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:35.680936 containerd[1558]: time="2025-07-09T14:57:35.680902104Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 6.087298432s" Jul 9 14:57:35.681117 containerd[1558]: time="2025-07-09T14:57:35.681081871Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Jul 9 14:57:39.643263 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 14:57:39.644127 systemd[1]: kubelet.service: Consumed 628ms CPU time, 109.4M memory peak. Jul 9 14:57:39.659720 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 14:57:39.725606 systemd[1]: Reload requested from client PID 2333 ('systemctl') (unit session-11.scope)... Jul 9 14:57:39.725654 systemd[1]: Reloading... Jul 9 14:57:39.906228 zram_generator::config[2377]: No configuration found. Jul 9 14:57:40.307725 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 9 14:57:40.467794 systemd[1]: Reloading finished in 741 ms. Jul 9 14:57:40.872525 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jul 9 14:57:40.872764 systemd[1]: kubelet.service: Failed with result 'signal'. Jul 9 14:57:40.873992 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 14:57:40.874106 systemd[1]: kubelet.service: Consumed 394ms CPU time, 97.2M memory peak. Jul 9 14:57:40.886265 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 14:57:41.500589 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 14:57:41.512235 (kubelet)[2442]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 9 14:57:41.702650 kubelet[2442]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 9 14:57:41.703483 kubelet[2442]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. 
Image garbage collector will get sandbox image information from CRI. Jul 9 14:57:41.703483 kubelet[2442]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 9 14:57:41.703483 kubelet[2442]: I0709 14:57:41.703239 2442 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 9 14:57:42.225468 kubelet[2442]: I0709 14:57:42.225385 2442 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Jul 9 14:57:42.225620 kubelet[2442]: I0709 14:57:42.225449 2442 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 9 14:57:42.226117 kubelet[2442]: I0709 14:57:42.226076 2442 server.go:934] "Client rotation is on, will bootstrap in background" Jul 9 14:57:42.281366 kubelet[2442]: I0709 14:57:42.281278 2442 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 9 14:57:42.284753 kubelet[2442]: E0709 14:57:42.284671 2442 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.24.4.222:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.24.4.222:6443: connect: connection refused" logger="UnhandledError" Jul 9 14:57:42.313167 kubelet[2442]: I0709 14:57:42.313112 2442 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 9 14:57:42.329829 kubelet[2442]: I0709 14:57:42.328964 2442 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 9 14:57:42.331520 kubelet[2442]: I0709 14:57:42.331482 2442 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jul 9 14:57:42.332041 kubelet[2442]: I0709 14:57:42.331959 2442 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 9 14:57:42.332690 kubelet[2442]: I0709 14:57:42.332186 2442 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-9-100-bf645a1a30.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 9 14:57:42.333319 kubelet[2442]: I0709 14:57:42.333291 2442 topology_manager.go:138] "Creating topology manager with none policy" Jul 9 14:57:42.333449 kubelet[2442]: I0709 14:57:42.333432 2442 container_manager_linux.go:300] "Creating device plugin manager" Jul 9 14:57:42.333915 kubelet[2442]: I0709 14:57:42.333887 2442 state_mem.go:36] "Initialized new in-memory state store" Jul 9 14:57:42.343277 kubelet[2442]: I0709 14:57:42.343242 2442 kubelet.go:408] "Attempting to sync node with API server" Jul 9 14:57:42.343506 kubelet[2442]: I0709 14:57:42.343481 2442 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 9 14:57:42.343705 kubelet[2442]: I0709 14:57:42.343683 2442 kubelet.go:314] "Adding apiserver pod source" Jul 9 14:57:42.343940 kubelet[2442]: I0709 14:57:42.343913 2442 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 9 14:57:42.346605 kubelet[2442]: W0709 14:57:42.345811 2442 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.222:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-9-100-bf645a1a30.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.222:6443: connect: connection refused Jul 9 14:57:42.346605 kubelet[2442]: E0709 14:57:42.345897 2442 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://172.24.4.222:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-9-100-bf645a1a30.novalocal&limit=500&resourceVersion=0\": dial tcp 172.24.4.222:6443: connect: connection refused" logger="UnhandledError" Jul 9 14:57:42.352890 kubelet[2442]: W0709 14:57:42.352775 2442 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.222:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.24.4.222:6443: connect: connection refused Jul 9 14:57:42.352890 kubelet[2442]: E0709 14:57:42.352828 2442 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.24.4.222:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.24.4.222:6443: connect: connection refused" logger="UnhandledError" Jul 9 14:57:42.354443 kubelet[2442]: I0709 14:57:42.353738 2442 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Jul 9 14:57:42.354443 kubelet[2442]: I0709 14:57:42.354237 2442 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 9 14:57:42.355319 kubelet[2442]: W0709 14:57:42.355302 2442 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jul 9 14:57:42.358317 kubelet[2442]: I0709 14:57:42.358298 2442 server.go:1274] "Started kubelet" Jul 9 14:57:42.360214 kubelet[2442]: I0709 14:57:42.360196 2442 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 9 14:57:42.370995 kubelet[2442]: I0709 14:57:42.370950 2442 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jul 9 14:57:42.374604 kubelet[2442]: I0709 14:57:42.373831 2442 server.go:449] "Adding debug handlers to kubelet server" Jul 9 14:57:42.374911 kubelet[2442]: E0709 14:57:42.372566 2442 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.222:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.222:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-9999-9-100-bf645a1a30.novalocal.18509d2ad6650049 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-9999-9-100-bf645a1a30.novalocal,UID:ci-9999-9-100-bf645a1a30.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-9999-9-100-bf645a1a30.novalocal,},FirstTimestamp:2025-07-09 14:57:42.358253641 +0000 UTC m=+0.758105368,LastTimestamp:2025-07-09 14:57:42.358253641 +0000 UTC m=+0.758105368,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-9-100-bf645a1a30.novalocal,}" Jul 9 14:57:42.377489 kubelet[2442]: I0709 14:57:42.377435 2442 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 9 14:57:42.378233 kubelet[2442]: I0709 14:57:42.378172 2442 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 9 14:57:42.378615 kubelet[2442]: I0709 14:57:42.378597 2442 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 9 14:57:42.379660 
kubelet[2442]: I0709 14:57:42.379630 2442 volume_manager.go:289] "Starting Kubelet Volume Manager" Jul 9 14:57:42.380117 kubelet[2442]: E0709 14:57:42.380078 2442 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-bf645a1a30.novalocal\" not found" Jul 9 14:57:42.382569 kubelet[2442]: E0709 14:57:42.380844 2442 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.222:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-9-100-bf645a1a30.novalocal?timeout=10s\": dial tcp 172.24.4.222:6443: connect: connection refused" interval="200ms" Jul 9 14:57:42.385862 kubelet[2442]: I0709 14:57:42.385826 2442 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Jul 9 14:57:42.386018 kubelet[2442]: I0709 14:57:42.385998 2442 factory.go:221] Registration of the systemd container factory successfully Jul 9 14:57:42.386190 kubelet[2442]: I0709 14:57:42.386170 2442 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 9 14:57:42.387013 kubelet[2442]: I0709 14:57:42.386008 2442 reconciler.go:26] "Reconciler: start to sync state" Jul 9 14:57:42.388437 kubelet[2442]: W0709 14:57:42.388396 2442 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.222:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.222:6443: connect: connection refused Jul 9 14:57:42.388603 kubelet[2442]: E0709 14:57:42.388581 2442 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.24.4.222:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.24.4.222:6443: connect: connection refused" logger="UnhandledError" Jul 9 14:57:42.388977 kubelet[2442]: E0709 14:57:42.388956 2442 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 9 14:57:42.390011 kubelet[2442]: I0709 14:57:42.389992 2442 factory.go:221] Registration of the containerd container factory successfully Jul 9 14:57:42.422362 kubelet[2442]: I0709 14:57:42.422242 2442 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 9 14:57:42.425866 kubelet[2442]: I0709 14:57:42.425818 2442 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jul 9 14:57:42.425957 kubelet[2442]: I0709 14:57:42.425926 2442 status_manager.go:217] "Starting to sync pod status with apiserver" Jul 9 14:57:42.426015 kubelet[2442]: I0709 14:57:42.425995 2442 kubelet.go:2321] "Starting kubelet main sync loop" Jul 9 14:57:42.426117 kubelet[2442]: E0709 14:57:42.426079 2442 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 9 14:57:42.432516 kubelet[2442]: W0709 14:57:42.432237 2442 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.222:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.222:6443: connect: connection refused Jul 9 14:57:42.432516 kubelet[2442]: E0709 14:57:42.432496 2442 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.24.4.222:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.24.4.222:6443: connect: connection refused" logger="UnhandledError" Jul 9 14:57:42.436644 kubelet[2442]: I0709 14:57:42.436620 2442 cpu_manager.go:214] "Starting CPU manager" policy="none" Jul 9 14:57:42.437534 kubelet[2442]: I0709 14:57:42.437364 2442 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jul 9 14:57:42.437534 kubelet[2442]: I0709 14:57:42.437404 2442 state_mem.go:36] "Initialized new in-memory state store" Jul 9 14:57:42.444635 kubelet[2442]: I0709 14:57:42.444593 2442 policy_none.go:49] "None policy: Start" Jul 9 14:57:42.445784 kubelet[2442]: I0709 14:57:42.445718 2442 memory_manager.go:170] "Starting memorymanager" policy="None" Jul 9 14:57:42.445870 kubelet[2442]: I0709 14:57:42.445796 2442 state_mem.go:35] "Initializing new in-memory state store" Jul 9 14:57:42.460680 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jul 9 14:57:42.476752 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jul 9 14:57:42.480984 kubelet[2442]: E0709 14:57:42.480825 2442 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-9999-9-100-bf645a1a30.novalocal\" not found" Jul 9 14:57:42.485766 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jul 9 14:57:42.497915 kubelet[2442]: I0709 14:57:42.497860 2442 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 9 14:57:42.498393 kubelet[2442]: I0709 14:57:42.498347 2442 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 9 14:57:42.498563 kubelet[2442]: I0709 14:57:42.498405 2442 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 9 14:57:42.499189 kubelet[2442]: I0709 14:57:42.499144 2442 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 9 14:57:42.503670 kubelet[2442]: E0709 14:57:42.503643 2442 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-9999-9-100-bf645a1a30.novalocal\" not found" Jul 9 14:57:42.544671 systemd[1]: Created slice kubepods-burstable-pod422dcf2db8ddfe2950af52e03de96b6e.slice - libcontainer container kubepods-burstable-pod422dcf2db8ddfe2950af52e03de96b6e.slice. 
Jul 9 14:57:42.562843 systemd[1]: Created slice kubepods-burstable-pod630bf591e71c48107e78e5a1a76f69ec.slice - libcontainer container kubepods-burstable-pod630bf591e71c48107e78e5a1a76f69ec.slice. Jul 9 14:57:42.581969 kubelet[2442]: E0709 14:57:42.581927 2442 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.222:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-9-100-bf645a1a30.novalocal?timeout=10s\": dial tcp 172.24.4.222:6443: connect: connection refused" interval="400ms" Jul 9 14:57:42.582625 systemd[1]: Created slice kubepods-burstable-pod32c56ef1f36dd2118846a7ee71fd2749.slice - libcontainer container kubepods-burstable-pod32c56ef1f36dd2118846a7ee71fd2749.slice. Jul 9 14:57:42.588117 kubelet[2442]: I0709 14:57:42.587992 2442 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/630bf591e71c48107e78e5a1a76f69ec-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"630bf591e71c48107e78e5a1a76f69ec\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.588322 kubelet[2442]: I0709 14:57:42.588247 2442 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/630bf591e71c48107e78e5a1a76f69ec-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"630bf591e71c48107e78e5a1a76f69ec\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.588322 kubelet[2442]: I0709 14:57:42.588293 2442 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/630bf591e71c48107e78e5a1a76f69ec-k8s-certs\") pod \"kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"630bf591e71c48107e78e5a1a76f69ec\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.588605 kubelet[2442]: I0709 14:57:42.588446 2442 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/630bf591e71c48107e78e5a1a76f69ec-kubeconfig\") pod \"kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"630bf591e71c48107e78e5a1a76f69ec\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.588605 kubelet[2442]: I0709 14:57:42.588561 2442 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/32c56ef1f36dd2118846a7ee71fd2749-kubeconfig\") pod \"kube-scheduler-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"32c56ef1f36dd2118846a7ee71fd2749\") " pod="kube-system/kube-scheduler-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.588827 kubelet[2442]: I0709 14:57:42.588583 2442 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/422dcf2db8ddfe2950af52e03de96b6e-ca-certs\") pod \"kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"422dcf2db8ddfe2950af52e03de96b6e\") " pod="kube-system/kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.588827 kubelet[2442]: I0709 14:57:42.588784 2442 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/422dcf2db8ddfe2950af52e03de96b6e-k8s-certs\") pod \"kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"422dcf2db8ddfe2950af52e03de96b6e\") " pod="kube-system/kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.589017 kubelet[2442]: I0709 14:57:42.588943 2442 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/422dcf2db8ddfe2950af52e03de96b6e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"422dcf2db8ddfe2950af52e03de96b6e\") " pod="kube-system/kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.589017 kubelet[2442]: I0709 14:57:42.588984 2442 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/630bf591e71c48107e78e5a1a76f69ec-ca-certs\") pod \"kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"630bf591e71c48107e78e5a1a76f69ec\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.601271 kubelet[2442]: I0709 14:57:42.600818 2442 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.601528 kubelet[2442]: E0709 14:57:42.601502 2442 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.222:6443/api/v1/nodes\": dial tcp 172.24.4.222:6443: connect: connection refused" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.806224 kubelet[2442]: I0709 14:57:42.806008 2442 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.808856 kubelet[2442]: E0709 14:57:42.807103 2442 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.222:6443/api/v1/nodes\": dial tcp 172.24.4.222:6443: connect: connection refused" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:42.863780 containerd[1558]: time="2025-07-09T14:57:42.863206347Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal,Uid:422dcf2db8ddfe2950af52e03de96b6e,Namespace:kube-system,Attempt:0,}" Jul 9 14:57:42.879586 containerd[1558]: time="2025-07-09T14:57:42.878794750Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal,Uid:630bf591e71c48107e78e5a1a76f69ec,Namespace:kube-system,Attempt:0,}" Jul 9 14:57:42.890173 containerd[1558]: time="2025-07-09T14:57:42.890063266Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-9-100-bf645a1a30.novalocal,Uid:32c56ef1f36dd2118846a7ee71fd2749,Namespace:kube-system,Attempt:0,}" Jul 9 14:57:42.983486 kubelet[2442]: E0709 14:57:42.983379 2442 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.222:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-9-100-bf645a1a30.novalocal?timeout=10s\": dial tcp 172.24.4.222:6443: connect: connection refused" interval="800ms" Jul 9 14:57:43.052589 containerd[1558]: time="2025-07-09T14:57:43.052538354Z" level=info msg="connecting to shim 3858bfd7437667df414c7d203eed8a6b9c5f9a22c012240855262971729faca2" 
address="unix:///run/containerd/s/dd5b7db94f8ac57f021c3e35bfd25014c8538d733033a12e7024f445b07e41a5" namespace=k8s.io protocol=ttrpc version=3 Jul 9 14:57:43.061838 containerd[1558]: time="2025-07-09T14:57:43.061574670Z" level=info msg="connecting to shim e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead" address="unix:///run/containerd/s/6dcc86629619ae9c68f0b9972b137ba0897f120b3d1d0192184fc1f2c28250cd" namespace=k8s.io protocol=ttrpc version=3 Jul 9 14:57:43.083492 containerd[1558]: time="2025-07-09T14:57:43.083118527Z" level=info msg="connecting to shim 96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39" address="unix:///run/containerd/s/373b54ba6f381b4533cf616f65bd7adc560faa2f543308a3604164117a1a93d9" namespace=k8s.io protocol=ttrpc version=3 Jul 9 14:57:43.113765 systemd[1]: Started cri-containerd-3858bfd7437667df414c7d203eed8a6b9c5f9a22c012240855262971729faca2.scope - libcontainer container 3858bfd7437667df414c7d203eed8a6b9c5f9a22c012240855262971729faca2. Jul 9 14:57:43.164610 systemd[1]: Started cri-containerd-96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39.scope - libcontainer container 96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39. Jul 9 14:57:43.166570 systemd[1]: Started cri-containerd-e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead.scope - libcontainer container e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead. Jul 9 14:57:43.213671 kubelet[2442]: I0709 14:57:43.212716 2442 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:43.214377 kubelet[2442]: E0709 14:57:43.214252 2442 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://172.24.4.222:6443/api/v1/nodes\": dial tcp 172.24.4.222:6443: connect: connection refused" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:43.249554 kubelet[2442]: W0709 14:57:43.249396 2442 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.222:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.222:6443: connect: connection refused Jul 9 14:57:43.249554 kubelet[2442]: E0709 14:57:43.249503 2442 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.24.4.222:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.24.4.222:6443: connect: connection refused" logger="UnhandledError" Jul 9 14:57:43.264315 containerd[1558]: time="2025-07-09T14:57:43.264255597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal,Uid:422dcf2db8ddfe2950af52e03de96b6e,Namespace:kube-system,Attempt:0,} returns sandbox id \"3858bfd7437667df414c7d203eed8a6b9c5f9a22c012240855262971729faca2\"" Jul 9 14:57:43.269281 containerd[1558]: time="2025-07-09T14:57:43.269244629Z" level=info msg="CreateContainer within sandbox \"3858bfd7437667df414c7d203eed8a6b9c5f9a22c012240855262971729faca2\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jul 9 14:57:43.277828 kubelet[2442]: W0709 14:57:43.277651 2442 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.222:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.24.4.222:6443: connect: connection refused Jul 9 14:57:43.278111 kubelet[2442]: 
E0709 14:57:43.278072 2442 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.24.4.222:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.24.4.222:6443: connect: connection refused" logger="UnhandledError" Jul 9 14:57:43.280851 containerd[1558]: time="2025-07-09T14:57:43.280794381Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal,Uid:630bf591e71c48107e78e5a1a76f69ec,Namespace:kube-system,Attempt:0,} returns sandbox id \"e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead\"" Jul 9 14:57:43.288040 containerd[1558]: time="2025-07-09T14:57:43.287371233Z" level=info msg="CreateContainer within sandbox \"e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jul 9 14:57:43.289475 containerd[1558]: time="2025-07-09T14:57:43.288358796Z" level=info msg="Container 58b18b0d70b2222620d23f3f729e1ec78221364de451a263117e20f4618dfa58: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:57:43.310122 containerd[1558]: time="2025-07-09T14:57:43.310079184Z" level=info msg="CreateContainer within sandbox \"3858bfd7437667df414c7d203eed8a6b9c5f9a22c012240855262971729faca2\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"58b18b0d70b2222620d23f3f729e1ec78221364de451a263117e20f4618dfa58\"" Jul 9 14:57:43.311378 containerd[1558]: time="2025-07-09T14:57:43.311331504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-9-100-bf645a1a30.novalocal,Uid:32c56ef1f36dd2118846a7ee71fd2749,Namespace:kube-system,Attempt:0,} returns sandbox id \"96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39\"" Jul 9 14:57:43.313177 containerd[1558]: time="2025-07-09T14:57:43.312325639Z" level=info msg="StartContainer for \"58b18b0d70b2222620d23f3f729e1ec78221364de451a263117e20f4618dfa58\"" Jul 9 14:57:43.316221 containerd[1558]: time="2025-07-09T14:57:43.315626774Z" level=info msg="CreateContainer within sandbox \"96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jul 9 14:57:43.317472 containerd[1558]: time="2025-07-09T14:57:43.317411673Z" level=info msg="Container ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:57:43.317703 containerd[1558]: time="2025-07-09T14:57:43.317675628Z" level=info msg="connecting to shim 58b18b0d70b2222620d23f3f729e1ec78221364de451a263117e20f4618dfa58" address="unix:///run/containerd/s/dd5b7db94f8ac57f021c3e35bfd25014c8538d733033a12e7024f445b07e41a5" protocol=ttrpc version=3 Jul 9 14:57:43.339573 containerd[1558]: time="2025-07-09T14:57:43.339494221Z" level=info msg="Container 66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:57:43.340428 containerd[1558]: time="2025-07-09T14:57:43.339559423Z" level=info msg="CreateContainer within sandbox \"e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\"" Jul 9 14:57:43.340873 containerd[1558]: time="2025-07-09T14:57:43.340824447Z" level=info msg="StartContainer for 
\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\"" Jul 9 14:57:43.341723 systemd[1]: Started cri-containerd-58b18b0d70b2222620d23f3f729e1ec78221364de451a263117e20f4618dfa58.scope - libcontainer container 58b18b0d70b2222620d23f3f729e1ec78221364de451a263117e20f4618dfa58. Jul 9 14:57:43.343209 containerd[1558]: time="2025-07-09T14:57:43.343153727Z" level=info msg="connecting to shim ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1" address="unix:///run/containerd/s/6dcc86629619ae9c68f0b9972b137ba0897f120b3d1d0192184fc1f2c28250cd" protocol=ttrpc version=3 Jul 9 14:57:43.360286 containerd[1558]: time="2025-07-09T14:57:43.359977187Z" level=info msg="CreateContainer within sandbox \"96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\"" Jul 9 14:57:43.360838 containerd[1558]: time="2025-07-09T14:57:43.360808417Z" level=info msg="StartContainer for \"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\"" Jul 9 14:57:43.362482 containerd[1558]: time="2025-07-09T14:57:43.362286890Z" level=info msg="connecting to shim 66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d" address="unix:///run/containerd/s/373b54ba6f381b4533cf616f65bd7adc560faa2f543308a3604164117a1a93d9" protocol=ttrpc version=3 Jul 9 14:57:43.371678 systemd[1]: Started cri-containerd-ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1.scope - libcontainer container ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1. Jul 9 14:57:43.400607 systemd[1]: Started cri-containerd-66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d.scope - libcontainer container 66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d. 
Jul 9 14:57:43.475634 containerd[1558]: time="2025-07-09T14:57:43.475578049Z" level=info msg="StartContainer for \"58b18b0d70b2222620d23f3f729e1ec78221364de451a263117e20f4618dfa58\" returns successfully" Jul 9 14:57:43.486925 containerd[1558]: time="2025-07-09T14:57:43.486871982Z" level=info msg="StartContainer for \"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" returns successfully" Jul 9 14:57:43.516885 containerd[1558]: time="2025-07-09T14:57:43.516738747Z" level=info msg="StartContainer for \"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\" returns successfully" Jul 9 14:57:44.017498 kubelet[2442]: I0709 14:57:44.017441 2442 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:46.359049 kubelet[2442]: I0709 14:57:46.358813 2442 apiserver.go:52] "Watching apiserver" Jul 9 14:57:46.363371 kubelet[2442]: E0709 14:57:46.359168 2442 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-9999-9-100-bf645a1a30.novalocal\" not found" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:46.386328 kubelet[2442]: I0709 14:57:46.386249 2442 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Jul 9 14:57:46.481000 kubelet[2442]: I0709 14:57:46.480302 2442 kubelet_node_status.go:75] "Successfully registered node" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:46.481000 kubelet[2442]: E0709 14:57:46.480356 2442 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-9999-9-100-bf645a1a30.novalocal\": node \"ci-9999-9-100-bf645a1a30.novalocal\" not found" Jul 9 14:57:46.608042 kubelet[2442]: E0709 14:57:46.606953 2442 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:49.163264 systemd[1]: Reload requested from client PID 2708 ('systemctl') (unit session-11.scope)... Jul 9 14:57:49.163346 systemd[1]: Reloading... Jul 9 14:57:49.295520 zram_generator::config[2753]: No configuration found. Jul 9 14:57:49.450666 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 9 14:57:49.640843 systemd[1]: Reloading finished in 476 ms. Jul 9 14:57:49.687233 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 14:57:49.702075 systemd[1]: kubelet.service: Deactivated successfully. Jul 9 14:57:49.703024 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 14:57:49.703125 systemd[1]: kubelet.service: Consumed 2.005s CPU time, 131.3M memory peak. Jul 9 14:57:49.708699 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 9 14:57:50.296132 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 9 14:57:50.306807 (kubelet)[2817]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 9 14:57:50.383752 kubelet[2817]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jul 9 14:57:50.383752 kubelet[2817]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jul 9 14:57:50.383752 kubelet[2817]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 9 14:57:50.384216 kubelet[2817]: I0709 14:57:50.383839 2817 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 9 14:57:50.415485 kubelet[2817]: I0709 14:57:50.415386 2817 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Jul 9 14:57:50.415485 kubelet[2817]: I0709 14:57:50.415427 2817 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 9 14:57:50.416566 kubelet[2817]: I0709 14:57:50.415925 2817 server.go:934] "Client rotation is on, will bootstrap in background" Jul 9 14:57:50.421484 kubelet[2817]: I0709 14:57:50.420087 2817 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jul 9 14:57:50.425790 kubelet[2817]: I0709 14:57:50.425581 2817 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 9 14:57:50.436032 kubelet[2817]: I0709 14:57:50.435727 2817 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 9 14:57:50.442162 kubelet[2817]: I0709 14:57:50.442099 2817 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jul 9 14:57:50.446107 kubelet[2817]: I0709 14:57:50.443258 2817 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jul 9 14:57:50.446107 kubelet[2817]: I0709 14:57:50.443439 2817 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 9 14:57:50.446107 kubelet[2817]: I0709 14:57:50.443496 2817 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ci-9999-9-100-bf645a1a30.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 9 14:57:50.446107 kubelet[2817]: I0709 14:57:50.443771 2817 topology_manager.go:138] "Creating topology manager with none policy" Jul 9 14:57:50.446574 kubelet[2817]: I0709 14:57:50.443790 2817 container_manager_linux.go:300] "Creating device plugin manager" Jul 9 14:57:50.446574 kubelet[2817]: I0709 14:57:50.443882 2817 state_mem.go:36] "Initialized new in-memory state store" Jul 9 14:57:50.446574 kubelet[2817]: I0709 14:57:50.444090 2817 kubelet.go:408] "Attempting to sync node with API server" Jul 9 14:57:50.446574 kubelet[2817]: I0709 14:57:50.444112 2817 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 9 14:57:50.446574 kubelet[2817]: I0709 14:57:50.444163 2817 kubelet.go:314] "Adding apiserver pod source" Jul 9 14:57:50.446574 kubelet[2817]: I0709 14:57:50.444195 2817 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 9 14:57:50.448410 kubelet[2817]: I0709 14:57:50.448385 2817 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Jul 9 14:57:50.449036 kubelet[2817]: I0709 14:57:50.449020 2817 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 9 14:57:50.450025 kubelet[2817]: I0709 14:57:50.450007 2817 server.go:1274] "Started kubelet" Jul 9 14:57:50.454184 kubelet[2817]: I0709 14:57:50.454110 2817 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jul 9 14:57:50.462161 kubelet[2817]: I0709 14:57:50.462108 2817 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 9 14:57:50.462855 kubelet[2817]: I0709 14:57:50.462742 2817 server.go:449] "Adding debug handlers to kubelet server" Jul 9 14:57:50.463197 kubelet[2817]: I0709 14:57:50.463179 2817 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 9 14:57:50.477016 kubelet[2817]: I0709 14:57:50.476994 2817 
fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 9 14:57:50.491059 kubelet[2817]: E0709 14:57:50.491030 2817 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 9 14:57:50.494497 kubelet[2817]: I0709 14:57:50.494440 2817 volume_manager.go:289] "Starting Kubelet Volume Manager" Jul 9 14:57:50.495487 kubelet[2817]: I0709 14:57:50.495434 2817 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 9 14:57:50.497022 kubelet[2817]: I0709 14:57:50.496996 2817 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Jul 9 14:57:50.497251 kubelet[2817]: I0709 14:57:50.497232 2817 reconciler.go:26] "Reconciler: start to sync state" Jul 9 14:57:50.501055 kubelet[2817]: I0709 14:57:50.500998 2817 factory.go:221] Registration of the systemd container factory successfully Jul 9 14:57:50.501805 kubelet[2817]: I0709 14:57:50.501096 2817 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 9 14:57:50.504056 kubelet[2817]: I0709 14:57:50.503965 2817 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 9 14:57:50.505408 kubelet[2817]: I0709 14:57:50.505268 2817 factory.go:221] Registration of the containerd container factory successfully Jul 9 14:57:50.506749 kubelet[2817]: I0709 14:57:50.505859 2817 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jul 9 14:57:50.507100 kubelet[2817]: I0709 14:57:50.507016 2817 status_manager.go:217] "Starting to sync pod status with apiserver" Jul 9 14:57:50.507100 kubelet[2817]: I0709 14:57:50.507062 2817 kubelet.go:2321] "Starting kubelet main sync loop" Jul 9 14:57:50.507596 kubelet[2817]: E0709 14:57:50.507574 2817 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 9 14:57:50.563248 kubelet[2817]: I0709 14:57:50.562356 2817 cpu_manager.go:214] "Starting CPU manager" policy="none" Jul 9 14:57:50.563248 kubelet[2817]: I0709 14:57:50.562420 2817 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jul 9 14:57:50.563248 kubelet[2817]: I0709 14:57:50.562477 2817 state_mem.go:36] "Initialized new in-memory state store" Jul 9 14:57:50.563248 kubelet[2817]: I0709 14:57:50.563018 2817 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jul 9 14:57:50.563248 kubelet[2817]: I0709 14:57:50.563033 2817 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jul 9 14:57:50.563248 kubelet[2817]: I0709 14:57:50.563114 2817 policy_none.go:49] "None policy: Start" Jul 9 14:57:50.565490 kubelet[2817]: I0709 14:57:50.565310 2817 memory_manager.go:170] "Starting memorymanager" policy="None" Jul 9 14:57:50.565490 kubelet[2817]: I0709 14:57:50.565356 2817 state_mem.go:35] "Initializing new in-memory state store" Jul 9 14:57:50.565680 kubelet[2817]: I0709 14:57:50.565594 2817 state_mem.go:75] "Updated machine memory state" Jul 9 14:57:50.576295 kubelet[2817]: I0709 14:57:50.576186 2817 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 9 14:57:50.576657 kubelet[2817]: I0709 14:57:50.576640 2817 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 9 
14:57:50.577032 kubelet[2817]: I0709 14:57:50.576988 2817 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 9 14:57:50.577740 kubelet[2817]: I0709 14:57:50.577368 2817 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 9 14:57:50.652065 kubelet[2817]: W0709 14:57:50.652024 2817 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jul 9 14:57:50.653049 kubelet[2817]: W0709 14:57:50.653031 2817 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jul 9 14:57:50.653235 kubelet[2817]: W0709 14:57:50.653186 2817 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jul 9 14:57:50.686021 kubelet[2817]: I0709 14:57:50.685989 2817 kubelet_node_status.go:72] "Attempting to register node" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:50.698639 kubelet[2817]: I0709 14:57:50.698474 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/630bf591e71c48107e78e5a1a76f69ec-kubeconfig\") pod \"kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"630bf591e71c48107e78e5a1a76f69ec\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:50.698639 kubelet[2817]: I0709 14:57:50.698518 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/32c56ef1f36dd2118846a7ee71fd2749-kubeconfig\") pod \"kube-scheduler-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"32c56ef1f36dd2118846a7ee71fd2749\") " pod="kube-system/kube-scheduler-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:50.698639 kubelet[2817]: I0709 14:57:50.698545 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/630bf591e71c48107e78e5a1a76f69ec-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"630bf591e71c48107e78e5a1a76f69ec\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:50.698639 kubelet[2817]: I0709 14:57:50.698566 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/422dcf2db8ddfe2950af52e03de96b6e-k8s-certs\") pod \"kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"422dcf2db8ddfe2950af52e03de96b6e\") " pod="kube-system/kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:50.699103 kubelet[2817]: I0709 14:57:50.698585 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/422dcf2db8ddfe2950af52e03de96b6e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"422dcf2db8ddfe2950af52e03de96b6e\") " pod="kube-system/kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:50.699103 kubelet[2817]: I0709 14:57:50.698606 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/630bf591e71c48107e78e5a1a76f69ec-ca-certs\") pod \"kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"630bf591e71c48107e78e5a1a76f69ec\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:50.699103 kubelet[2817]: I0709 14:57:50.698893 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/630bf591e71c48107e78e5a1a76f69ec-k8s-certs\") pod \"kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"630bf591e71c48107e78e5a1a76f69ec\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:50.699103 kubelet[2817]: I0709 14:57:50.698938 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/630bf591e71c48107e78e5a1a76f69ec-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"630bf591e71c48107e78e5a1a76f69ec\") " pod="kube-system/kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:50.699259 kubelet[2817]: I0709 14:57:50.698958 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/422dcf2db8ddfe2950af52e03de96b6e-ca-certs\") pod \"kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal\" (UID: \"422dcf2db8ddfe2950af52e03de96b6e\") " pod="kube-system/kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:50.753215 kubelet[2817]: I0709 14:57:50.752514 2817 kubelet_node_status.go:111] "Node was previously registered" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:50.753215 kubelet[2817]: I0709 14:57:50.752697 2817 kubelet_node_status.go:75] "Successfully registered node" node="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:51.449840 kubelet[2817]: I0709 14:57:51.449260 2817 apiserver.go:52] "Watching apiserver" Jul 9 14:57:51.498394 kubelet[2817]: I0709 14:57:51.498266 2817 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Jul 9 14:57:51.634436 kubelet[2817]: W0709 14:57:51.634355 2817 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jul 9 14:57:51.635959 kubelet[2817]: E0709 14:57:51.635367 2817 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal" Jul 9 14:57:51.706408 kubelet[2817]: I0709 14:57:51.706149 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-9999-9-100-bf645a1a30.novalocal" podStartSLOduration=1.706117984 podStartE2EDuration="1.706117984s" podCreationTimestamp="2025-07-09 14:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 14:57:51.706073189 +0000 UTC m=+1.389373901" watchObservedRunningTime="2025-07-09 14:57:51.706117984 +0000 UTC m=+1.389418685" Jul 9 14:57:51.722072 kubelet[2817]: I0709 14:57:51.721881 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal" podStartSLOduration=1.721859636 podStartE2EDuration="1.721859636s" 
podCreationTimestamp="2025-07-09 14:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 14:57:51.721082277 +0000 UTC m=+1.404382998" watchObservedRunningTime="2025-07-09 14:57:51.721859636 +0000 UTC m=+1.405160337" Jul 9 14:57:51.744836 kubelet[2817]: I0709 14:57:51.744764 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-9999-9-100-bf645a1a30.novalocal" podStartSLOduration=1.744737727 podStartE2EDuration="1.744737727s" podCreationTimestamp="2025-07-09 14:57:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 14:57:51.742827694 +0000 UTC m=+1.426128395" watchObservedRunningTime="2025-07-09 14:57:51.744737727 +0000 UTC m=+1.428038448" Jul 9 14:57:54.506636 kubelet[2817]: I0709 14:57:54.506579 2817 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jul 9 14:57:54.510219 containerd[1558]: time="2025-07-09T14:57:54.509959467Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jul 9 14:57:54.511907 kubelet[2817]: I0709 14:57:54.511693 2817 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jul 9 14:57:55.567096 systemd[1]: Created slice kubepods-besteffort-poda3582226_a0bd_40ef_b3cb_800ac0588065.slice - libcontainer container kubepods-besteffort-poda3582226_a0bd_40ef_b3cb_800ac0588065.slice. Jul 9 14:57:55.649480 kubelet[2817]: I0709 14:57:55.648831 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3582226-a0bd-40ef-b3cb-800ac0588065-lib-modules\") pod \"kube-proxy-xh2cb\" (UID: \"a3582226-a0bd-40ef-b3cb-800ac0588065\") " pod="kube-system/kube-proxy-xh2cb" Jul 9 14:57:55.649480 kubelet[2817]: I0709 14:57:55.648892 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/a3582226-a0bd-40ef-b3cb-800ac0588065-kube-proxy\") pod \"kube-proxy-xh2cb\" (UID: \"a3582226-a0bd-40ef-b3cb-800ac0588065\") " pod="kube-system/kube-proxy-xh2cb" Jul 9 14:57:55.649480 kubelet[2817]: I0709 14:57:55.648924 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a3582226-a0bd-40ef-b3cb-800ac0588065-xtables-lock\") pod \"kube-proxy-xh2cb\" (UID: \"a3582226-a0bd-40ef-b3cb-800ac0588065\") " pod="kube-system/kube-proxy-xh2cb" Jul 9 14:57:55.649480 kubelet[2817]: I0709 14:57:55.648957 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngqhm\" (UniqueName: \"kubernetes.io/projected/a3582226-a0bd-40ef-b3cb-800ac0588065-kube-api-access-ngqhm\") pod \"kube-proxy-xh2cb\" (UID: \"a3582226-a0bd-40ef-b3cb-800ac0588065\") " pod="kube-system/kube-proxy-xh2cb" Jul 9 14:57:55.802622 systemd[1]: Created slice kubepods-besteffort-pod85d1f0d4_d203_460d_b655_273a9e721bdc.slice - libcontainer container kubepods-besteffort-pod85d1f0d4_d203_460d_b655_273a9e721bdc.slice. 
Jul 9 14:57:55.863609 kubelet[2817]: I0709 14:57:55.863523 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4x6g\" (UniqueName: \"kubernetes.io/projected/85d1f0d4-d203-460d-b655-273a9e721bdc-kube-api-access-t4x6g\") pod \"tigera-operator-5bf8dfcb4-lhj85\" (UID: \"85d1f0d4-d203-460d-b655-273a9e721bdc\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-lhj85" Jul 9 14:57:55.863609 kubelet[2817]: I0709 14:57:55.863610 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/85d1f0d4-d203-460d-b655-273a9e721bdc-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-lhj85\" (UID: \"85d1f0d4-d203-460d-b655-273a9e721bdc\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-lhj85" Jul 9 14:57:55.881902 containerd[1558]: time="2025-07-09T14:57:55.881801110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xh2cb,Uid:a3582226-a0bd-40ef-b3cb-800ac0588065,Namespace:kube-system,Attempt:0,}" Jul 9 14:57:55.939219 containerd[1558]: time="2025-07-09T14:57:55.939070273Z" level=info msg="connecting to shim b213d4bc0faec35c24c194415341dc8696e14fc8d3f55a69f04bb42ff89a9565" address="unix:///run/containerd/s/1223c50c6b738bb707157b39b35774cc4df7709fb272f8e86f29081302ecc0a2" namespace=k8s.io protocol=ttrpc version=3 Jul 9 14:57:56.023698 systemd[1]: Started cri-containerd-b213d4bc0faec35c24c194415341dc8696e14fc8d3f55a69f04bb42ff89a9565.scope - libcontainer container b213d4bc0faec35c24c194415341dc8696e14fc8d3f55a69f04bb42ff89a9565. Jul 9 14:57:56.074382 containerd[1558]: time="2025-07-09T14:57:56.074261925Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xh2cb,Uid:a3582226-a0bd-40ef-b3cb-800ac0588065,Namespace:kube-system,Attempt:0,} returns sandbox id \"b213d4bc0faec35c24c194415341dc8696e14fc8d3f55a69f04bb42ff89a9565\"" Jul 9 14:57:56.081160 containerd[1558]: time="2025-07-09T14:57:56.081017898Z" level=info msg="CreateContainer within sandbox \"b213d4bc0faec35c24c194415341dc8696e14fc8d3f55a69f04bb42ff89a9565\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jul 9 14:57:56.100441 containerd[1558]: time="2025-07-09T14:57:56.100386914Z" level=info msg="Container 0545f9d16965ebb5999640aa303c501e8621b277463ad503f9601964f29b01d8: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:57:56.108789 containerd[1558]: time="2025-07-09T14:57:56.108710127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-lhj85,Uid:85d1f0d4-d203-460d-b655-273a9e721bdc,Namespace:tigera-operator,Attempt:0,}" Jul 9 14:57:56.121894 containerd[1558]: time="2025-07-09T14:57:56.121702690Z" level=info msg="CreateContainer within sandbox \"b213d4bc0faec35c24c194415341dc8696e14fc8d3f55a69f04bb42ff89a9565\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"0545f9d16965ebb5999640aa303c501e8621b277463ad503f9601964f29b01d8\"" Jul 9 14:57:56.126475 containerd[1558]: time="2025-07-09T14:57:56.125625819Z" level=info msg="StartContainer for \"0545f9d16965ebb5999640aa303c501e8621b277463ad503f9601964f29b01d8\"" Jul 9 14:57:56.130041 containerd[1558]: time="2025-07-09T14:57:56.129973485Z" level=info msg="connecting to shim 0545f9d16965ebb5999640aa303c501e8621b277463ad503f9601964f29b01d8" address="unix:///run/containerd/s/1223c50c6b738bb707157b39b35774cc4df7709fb272f8e86f29081302ecc0a2" protocol=ttrpc version=3 Jul 9 14:57:56.163630 systemd[1]: Started 
cri-containerd-0545f9d16965ebb5999640aa303c501e8621b277463ad503f9601964f29b01d8.scope - libcontainer container 0545f9d16965ebb5999640aa303c501e8621b277463ad503f9601964f29b01d8. Jul 9 14:57:56.164241 containerd[1558]: time="2025-07-09T14:57:56.164189983Z" level=info msg="connecting to shim 34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89" address="unix:///run/containerd/s/2cbdfe341178e08aaffb918245755042099acbdbbc5d0bb9c5629dad14c7d015" namespace=k8s.io protocol=ttrpc version=3 Jul 9 14:57:56.207624 systemd[1]: Started cri-containerd-34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89.scope - libcontainer container 34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89. Jul 9 14:57:56.254208 containerd[1558]: time="2025-07-09T14:57:56.254155251Z" level=info msg="StartContainer for \"0545f9d16965ebb5999640aa303c501e8621b277463ad503f9601964f29b01d8\" returns successfully" Jul 9 14:57:56.285316 containerd[1558]: time="2025-07-09T14:57:56.285270380Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-lhj85,Uid:85d1f0d4-d203-460d-b655-273a9e721bdc,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89\"" Jul 9 14:57:56.290270 containerd[1558]: time="2025-07-09T14:57:56.289883584Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Jul 9 14:57:56.600634 kubelet[2817]: I0709 14:57:56.599906 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-xh2cb" podStartSLOduration=1.5997521940000001 podStartE2EDuration="1.599752194s" podCreationTimestamp="2025-07-09 14:57:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 14:57:56.598048769 +0000 UTC m=+6.281349560" watchObservedRunningTime="2025-07-09 14:57:56.599752194 +0000 UTC m=+6.283052905" Jul 9 14:57:56.842592 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3578739117.mount: Deactivated successfully. Jul 9 14:57:58.461185 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2668767827.mount: Deactivated successfully. 
Jul 9 14:57:59.697660 containerd[1558]: time="2025-07-09T14:57:59.697551809Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:59.699783 containerd[1558]: time="2025-07-09T14:57:59.699756114Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Jul 9 14:57:59.700434 containerd[1558]: time="2025-07-09T14:57:59.700384653Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:59.704176 containerd[1558]: time="2025-07-09T14:57:59.704113698Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:57:59.704929 containerd[1558]: time="2025-07-09T14:57:59.704826484Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 3.414874572s" Jul 9 14:57:59.704929 containerd[1558]: time="2025-07-09T14:57:59.704895053Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Jul 9 14:57:59.710976 containerd[1558]: time="2025-07-09T14:57:59.710864691Z" level=info msg="CreateContainer within sandbox \"34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jul 9 14:57:59.733403 containerd[1558]: time="2025-07-09T14:57:59.733294827Z" level=info msg="Container e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:57:59.756165 containerd[1558]: time="2025-07-09T14:57:59.756116597Z" level=info msg="CreateContainer within sandbox \"34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\"" Jul 9 14:57:59.757808 containerd[1558]: time="2025-07-09T14:57:59.757687785Z" level=info msg="StartContainer for \"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\"" Jul 9 14:57:59.759919 containerd[1558]: time="2025-07-09T14:57:59.759866942Z" level=info msg="connecting to shim e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e" address="unix:///run/containerd/s/2cbdfe341178e08aaffb918245755042099acbdbbc5d0bb9c5629dad14c7d015" protocol=ttrpc version=3 Jul 9 14:57:59.841395 systemd[1]: Started cri-containerd-e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e.scope - libcontainer container e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e. 
Jul 9 14:57:59.998937 containerd[1558]: time="2025-07-09T14:57:59.998688946Z" level=info msg="StartContainer for \"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\" returns successfully" Jul 9 14:58:00.718217 kubelet[2817]: I0709 14:58:00.717873 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-lhj85" podStartSLOduration=2.298446649 podStartE2EDuration="5.717758986s" podCreationTimestamp="2025-07-09 14:57:55 +0000 UTC" firstStartedPulling="2025-07-09 14:57:56.287311219 +0000 UTC m=+5.970611930" lastFinishedPulling="2025-07-09 14:57:59.706623556 +0000 UTC m=+9.389924267" observedRunningTime="2025-07-09 14:58:00.717705817 +0000 UTC m=+10.401006538" watchObservedRunningTime="2025-07-09 14:58:00.717758986 +0000 UTC m=+10.401059698" Jul 9 14:58:08.252425 sudo[1851]: pam_unix(sudo:session): session closed for user root Jul 9 14:58:08.534731 sshd[1849]: Connection closed by 172.24.4.1 port 39996 Jul 9 14:58:08.534186 sshd-session[1832]: pam_unix(sshd:session): session closed for user core Jul 9 14:58:08.548381 systemd[1]: sshd@8-172.24.4.222:22-172.24.4.1:39996.service: Deactivated successfully. Jul 9 14:58:08.561394 systemd[1]: session-11.scope: Deactivated successfully. Jul 9 14:58:08.563519 systemd[1]: session-11.scope: Consumed 8.691s CPU time, 231M memory peak. Jul 9 14:58:08.568795 systemd-logind[1533]: Session 11 logged out. Waiting for processes to exit. Jul 9 14:58:08.571192 systemd-logind[1533]: Removed session 11. Jul 9 14:58:13.893388 systemd[1]: Created slice kubepods-besteffort-pod1f4bd092_2b1f_4cd5_81dc_313f7b55b38a.slice - libcontainer container kubepods-besteffort-pod1f4bd092_2b1f_4cd5_81dc_313f7b55b38a.slice. Jul 9 14:58:13.944396 kubelet[2817]: I0709 14:58:13.944159 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7fpx\" (UniqueName: \"kubernetes.io/projected/1f4bd092-2b1f-4cd5-81dc-313f7b55b38a-kube-api-access-n7fpx\") pod \"calico-typha-7cf446b48-r6jhs\" (UID: \"1f4bd092-2b1f-4cd5-81dc-313f7b55b38a\") " pod="calico-system/calico-typha-7cf446b48-r6jhs" Jul 9 14:58:13.946219 kubelet[2817]: I0709 14:58:13.944702 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/1f4bd092-2b1f-4cd5-81dc-313f7b55b38a-typha-certs\") pod \"calico-typha-7cf446b48-r6jhs\" (UID: \"1f4bd092-2b1f-4cd5-81dc-313f7b55b38a\") " pod="calico-system/calico-typha-7cf446b48-r6jhs" Jul 9 14:58:13.946219 kubelet[2817]: I0709 14:58:13.944973 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f4bd092-2b1f-4cd5-81dc-313f7b55b38a-tigera-ca-bundle\") pod \"calico-typha-7cf446b48-r6jhs\" (UID: \"1f4bd092-2b1f-4cd5-81dc-313f7b55b38a\") " pod="calico-system/calico-typha-7cf446b48-r6jhs" Jul 9 14:58:14.202441 containerd[1558]: time="2025-07-09T14:58:14.202148891Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7cf446b48-r6jhs,Uid:1f4bd092-2b1f-4cd5-81dc-313f7b55b38a,Namespace:calico-system,Attempt:0,}" Jul 9 14:58:14.268675 systemd[1]: Created slice kubepods-besteffort-pod36d23b64_d4b5_412f_b96c_ed79214183e6.slice - libcontainer container kubepods-besteffort-pod36d23b64_d4b5_412f_b96c_ed79214183e6.slice. 
Jul 9 14:58:14.283874 containerd[1558]: time="2025-07-09T14:58:14.283790651Z" level=info msg="connecting to shim 6ae00df21446f64a3516fe17c241a09fa9e9c92c85165e0b2541c2202e4f225c" address="unix:///run/containerd/s/46e42033a9a38a814af42db25f0793ac76a81f0820f7783ad9417eb33e3967f5" namespace=k8s.io protocol=ttrpc version=3 Jul 9 14:58:14.349487 kubelet[2817]: I0709 14:58:14.348694 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/36d23b64-d4b5-412f-b96c-ed79214183e6-lib-modules\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.349487 kubelet[2817]: I0709 14:58:14.348774 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/36d23b64-d4b5-412f-b96c-ed79214183e6-cni-net-dir\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.349487 kubelet[2817]: I0709 14:58:14.348799 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/36d23b64-d4b5-412f-b96c-ed79214183e6-policysync\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.349487 kubelet[2817]: I0709 14:58:14.348826 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/36d23b64-d4b5-412f-b96c-ed79214183e6-xtables-lock\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.349487 kubelet[2817]: I0709 14:58:14.348850 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlcdt\" (UniqueName: \"kubernetes.io/projected/36d23b64-d4b5-412f-b96c-ed79214183e6-kube-api-access-tlcdt\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.349881 kubelet[2817]: I0709 14:58:14.348874 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/36d23b64-d4b5-412f-b96c-ed79214183e6-var-run-calico\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.349881 kubelet[2817]: I0709 14:58:14.348967 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/36d23b64-d4b5-412f-b96c-ed79214183e6-cni-log-dir\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.349881 kubelet[2817]: I0709 14:58:14.349021 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/36d23b64-d4b5-412f-b96c-ed79214183e6-node-certs\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.349881 kubelet[2817]: I0709 14:58:14.349051 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d23b64-d4b5-412f-b96c-ed79214183e6-tigera-ca-bundle\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.349881 kubelet[2817]: I0709 14:58:14.349082 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/36d23b64-d4b5-412f-b96c-ed79214183e6-var-lib-calico\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.350070 kubelet[2817]: I0709 14:58:14.349117 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/36d23b64-d4b5-412f-b96c-ed79214183e6-cni-bin-dir\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.350070 kubelet[2817]: I0709 14:58:14.349149 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/36d23b64-d4b5-412f-b96c-ed79214183e6-flexvol-driver-host\") pod \"calico-node-4mm6r\" (UID: \"36d23b64-d4b5-412f-b96c-ed79214183e6\") " pod="calico-system/calico-node-4mm6r" Jul 9 14:58:14.387362 systemd[1]: Started cri-containerd-6ae00df21446f64a3516fe17c241a09fa9e9c92c85165e0b2541c2202e4f225c.scope - libcontainer container 6ae00df21446f64a3516fe17c241a09fa9e9c92c85165e0b2541c2202e4f225c. Jul 9 14:58:14.464630 kubelet[2817]: E0709 14:58:14.461921 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.464630 kubelet[2817]: W0709 14:58:14.461966 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.464630 kubelet[2817]: E0709 14:58:14.462054 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.491224 kubelet[2817]: E0709 14:58:14.491038 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.491224 kubelet[2817]: W0709 14:58:14.491075 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.491224 kubelet[2817]: E0709 14:58:14.491137 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.542255 kubelet[2817]: E0709 14:58:14.542175 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:14.549526 kubelet[2817]: E0709 14:58:14.549223 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.549526 kubelet[2817]: W0709 14:58:14.549269 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.549526 kubelet[2817]: E0709 14:58:14.549307 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.549987 kubelet[2817]: E0709 14:58:14.549967 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.550108 kubelet[2817]: W0709 14:58:14.550092 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.550288 kubelet[2817]: E0709 14:58:14.550183 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.550681 kubelet[2817]: E0709 14:58:14.550507 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.550681 kubelet[2817]: W0709 14:58:14.550523 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.550681 kubelet[2817]: E0709 14:58:14.550541 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.551280 kubelet[2817]: E0709 14:58:14.551253 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.551481 kubelet[2817]: W0709 14:58:14.551426 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.551628 kubelet[2817]: E0709 14:58:14.551585 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.553494 kubelet[2817]: E0709 14:58:14.553321 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.553494 kubelet[2817]: W0709 14:58:14.553350 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.553494 kubelet[2817]: E0709 14:58:14.553375 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.554010 kubelet[2817]: E0709 14:58:14.553885 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.554010 kubelet[2817]: W0709 14:58:14.553900 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.554010 kubelet[2817]: E0709 14:58:14.553914 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.554279 kubelet[2817]: E0709 14:58:14.554257 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.554578 kubelet[2817]: W0709 14:58:14.554381 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.554578 kubelet[2817]: E0709 14:58:14.554402 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.555239 kubelet[2817]: E0709 14:58:14.554831 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.555567 kubelet[2817]: W0709 14:58:14.555345 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.555567 kubelet[2817]: E0709 14:58:14.555368 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.555884 kubelet[2817]: E0709 14:58:14.555867 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.556238 kubelet[2817]: W0709 14:58:14.555966 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.556238 kubelet[2817]: E0709 14:58:14.555985 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.556705 kubelet[2817]: E0709 14:58:14.556511 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.556705 kubelet[2817]: W0709 14:58:14.556526 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.556705 kubelet[2817]: E0709 14:58:14.556539 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.557658 kubelet[2817]: E0709 14:58:14.557477 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.557658 kubelet[2817]: W0709 14:58:14.557522 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.557658 kubelet[2817]: E0709 14:58:14.557535 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.557906 kubelet[2817]: E0709 14:58:14.557890 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.558145 kubelet[2817]: W0709 14:58:14.557985 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.558145 kubelet[2817]: E0709 14:58:14.558005 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.558368 kubelet[2817]: E0709 14:58:14.558351 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.558537 kubelet[2817]: W0709 14:58:14.558514 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.559397 kubelet[2817]: E0709 14:58:14.559277 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.559587 kubelet[2817]: E0709 14:58:14.559570 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.559727 kubelet[2817]: W0709 14:58:14.559707 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.559920 kubelet[2817]: E0709 14:58:14.559815 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.560115 kubelet[2817]: E0709 14:58:14.560099 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.560199 kubelet[2817]: W0709 14:58:14.560185 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.560271 kubelet[2817]: E0709 14:58:14.560258 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.560764 kubelet[2817]: E0709 14:58:14.560586 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.560764 kubelet[2817]: W0709 14:58:14.560602 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.560764 kubelet[2817]: E0709 14:58:14.560615 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.561956 kubelet[2817]: E0709 14:58:14.561927 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.562062 kubelet[2817]: W0709 14:58:14.562047 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.562151 kubelet[2817]: E0709 14:58:14.562137 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.562430 kubelet[2817]: E0709 14:58:14.562416 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.562648 kubelet[2817]: W0709 14:58:14.562552 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.562648 kubelet[2817]: E0709 14:58:14.562572 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.562852 kubelet[2817]: E0709 14:58:14.562837 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.563042 kubelet[2817]: W0709 14:58:14.562933 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.563042 kubelet[2817]: E0709 14:58:14.562954 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.563223 kubelet[2817]: E0709 14:58:14.563207 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.563441 kubelet[2817]: W0709 14:58:14.563318 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.563441 kubelet[2817]: E0709 14:58:14.563339 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.564598 kubelet[2817]: E0709 14:58:14.564551 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.564598 kubelet[2817]: W0709 14:58:14.564580 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.564598 kubelet[2817]: E0709 14:58:14.564600 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.564822 kubelet[2817]: I0709 14:58:14.564630 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2052df22-65ee-4914-9e6a-b1a620327c58-kubelet-dir\") pod \"csi-node-driver-sbhtt\" (UID: \"2052df22-65ee-4914-9e6a-b1a620327c58\") " pod="calico-system/csi-node-driver-sbhtt" Jul 9 14:58:14.565636 kubelet[2817]: E0709 14:58:14.565601 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.565636 kubelet[2817]: W0709 14:58:14.565623 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.565944 kubelet[2817]: E0709 14:58:14.565659 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.565944 kubelet[2817]: I0709 14:58:14.565685 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmlhm\" (UniqueName: \"kubernetes.io/projected/2052df22-65ee-4914-9e6a-b1a620327c58-kube-api-access-dmlhm\") pod \"csi-node-driver-sbhtt\" (UID: \"2052df22-65ee-4914-9e6a-b1a620327c58\") " pod="calico-system/csi-node-driver-sbhtt" Jul 9 14:58:14.566119 kubelet[2817]: E0709 14:58:14.565983 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.566119 kubelet[2817]: W0709 14:58:14.565997 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.566119 kubelet[2817]: E0709 14:58:14.566018 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.566352 kubelet[2817]: E0709 14:58:14.566254 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.566352 kubelet[2817]: W0709 14:58:14.566267 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.566352 kubelet[2817]: E0709 14:58:14.566296 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.566693 kubelet[2817]: E0709 14:58:14.566667 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.566693 kubelet[2817]: W0709 14:58:14.566679 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.566832 kubelet[2817]: E0709 14:58:14.566816 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.566878 kubelet[2817]: I0709 14:58:14.566844 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2052df22-65ee-4914-9e6a-b1a620327c58-registration-dir\") pod \"csi-node-driver-sbhtt\" (UID: \"2052df22-65ee-4914-9e6a-b1a620327c58\") " pod="calico-system/csi-node-driver-sbhtt" Jul 9 14:58:14.567431 kubelet[2817]: E0709 14:58:14.567362 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.567431 kubelet[2817]: W0709 14:58:14.567382 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.567431 kubelet[2817]: E0709 14:58:14.567401 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.569561 kubelet[2817]: E0709 14:58:14.568021 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.569561 kubelet[2817]: W0709 14:58:14.568039 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.569561 kubelet[2817]: E0709 14:58:14.568051 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.569561 kubelet[2817]: E0709 14:58:14.568577 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.569561 kubelet[2817]: W0709 14:58:14.568589 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.569561 kubelet[2817]: E0709 14:58:14.568601 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.569561 kubelet[2817]: I0709 14:58:14.568618 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2052df22-65ee-4914-9e6a-b1a620327c58-socket-dir\") pod \"csi-node-driver-sbhtt\" (UID: \"2052df22-65ee-4914-9e6a-b1a620327c58\") " pod="calico-system/csi-node-driver-sbhtt" Jul 9 14:58:14.569561 kubelet[2817]: E0709 14:58:14.569124 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.569561 kubelet[2817]: W0709 14:58:14.569137 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.570065 kubelet[2817]: E0709 14:58:14.569152 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.570065 kubelet[2817]: I0709 14:58:14.569184 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/2052df22-65ee-4914-9e6a-b1a620327c58-varrun\") pod \"csi-node-driver-sbhtt\" (UID: \"2052df22-65ee-4914-9e6a-b1a620327c58\") " pod="calico-system/csi-node-driver-sbhtt" Jul 9 14:58:14.570065 kubelet[2817]: E0709 14:58:14.569956 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.570065 kubelet[2817]: W0709 14:58:14.569970 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.570233 kubelet[2817]: E0709 14:58:14.570067 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.570610 kubelet[2817]: E0709 14:58:14.570586 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.570610 kubelet[2817]: W0709 14:58:14.570604 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.570862 kubelet[2817]: E0709 14:58:14.570785 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.570967 kubelet[2817]: E0709 14:58:14.570846 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.571088 kubelet[2817]: W0709 14:58:14.571061 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.571234 kubelet[2817]: E0709 14:58:14.571200 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.571665 kubelet[2817]: E0709 14:58:14.571586 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.571776 kubelet[2817]: W0709 14:58:14.571760 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.571876 kubelet[2817]: E0709 14:58:14.571857 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.572274 kubelet[2817]: E0709 14:58:14.572138 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.572274 kubelet[2817]: W0709 14:58:14.572153 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.572274 kubelet[2817]: E0709 14:58:14.572165 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.572579 kubelet[2817]: E0709 14:58:14.572527 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.572579 kubelet[2817]: W0709 14:58:14.572542 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.572579 kubelet[2817]: E0709 14:58:14.572555 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.576944 containerd[1558]: time="2025-07-09T14:58:14.576900437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-4mm6r,Uid:36d23b64-d4b5-412f-b96c-ed79214183e6,Namespace:calico-system,Attempt:0,}" Jul 9 14:58:14.672236 kubelet[2817]: E0709 14:58:14.671567 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.672236 kubelet[2817]: W0709 14:58:14.671602 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.672236 kubelet[2817]: E0709 14:58:14.671633 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.672868 kubelet[2817]: E0709 14:58:14.672836 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.673128 kubelet[2817]: W0709 14:58:14.673055 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.673128 kubelet[2817]: E0709 14:58:14.673088 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.673442 kubelet[2817]: E0709 14:58:14.673367 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.673442 kubelet[2817]: W0709 14:58:14.673406 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.673442 kubelet[2817]: E0709 14:58:14.673437 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.673668 kubelet[2817]: E0709 14:58:14.673637 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.673668 kubelet[2817]: W0709 14:58:14.673655 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.673896 kubelet[2817]: E0709 14:58:14.673684 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.673896 kubelet[2817]: E0709 14:58:14.673839 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.673896 kubelet[2817]: W0709 14:58:14.673849 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.673896 kubelet[2817]: E0709 14:58:14.673865 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.674067 kubelet[2817]: E0709 14:58:14.674051 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.674067 kubelet[2817]: W0709 14:58:14.674062 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.674122 kubelet[2817]: E0709 14:58:14.674072 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.674406 kubelet[2817]: E0709 14:58:14.674357 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.674406 kubelet[2817]: W0709 14:58:14.674374 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.674717 kubelet[2817]: E0709 14:58:14.674432 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.674934 kubelet[2817]: E0709 14:58:14.674736 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.674934 kubelet[2817]: W0709 14:58:14.674749 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.674934 kubelet[2817]: E0709 14:58:14.674760 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.675374 kubelet[2817]: E0709 14:58:14.675139 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.675374 kubelet[2817]: W0709 14:58:14.675151 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.675374 kubelet[2817]: E0709 14:58:14.675169 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.675539 kubelet[2817]: E0709 14:58:14.675501 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.675539 kubelet[2817]: W0709 14:58:14.675512 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.675539 kubelet[2817]: E0709 14:58:14.675534 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.676659 kubelet[2817]: E0709 14:58:14.676575 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.676659 kubelet[2817]: W0709 14:58:14.676593 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.676659 kubelet[2817]: E0709 14:58:14.676629 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.677906 kubelet[2817]: E0709 14:58:14.676818 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.677906 kubelet[2817]: W0709 14:58:14.676829 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.677906 kubelet[2817]: E0709 14:58:14.676863 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.677906 kubelet[2817]: E0709 14:58:14.677547 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.677906 kubelet[2817]: W0709 14:58:14.677559 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.677906 kubelet[2817]: E0709 14:58:14.677730 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.678911 kubelet[2817]: E0709 14:58:14.677976 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.678911 kubelet[2817]: W0709 14:58:14.677989 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.678911 kubelet[2817]: E0709 14:58:14.678492 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.678911 kubelet[2817]: W0709 14:58:14.678503 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.678911 kubelet[2817]: E0709 14:58:14.678516 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.679377 kubelet[2817]: E0709 14:58:14.679112 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.679377 kubelet[2817]: W0709 14:58:14.679130 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.679377 kubelet[2817]: E0709 14:58:14.679142 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.679377 kubelet[2817]: E0709 14:58:14.679263 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.679377 kubelet[2817]: E0709 14:58:14.679364 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.679377 kubelet[2817]: W0709 14:58:14.679376 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.680269 kubelet[2817]: E0709 14:58:14.679536 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.680269 kubelet[2817]: E0709 14:58:14.680053 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.680269 kubelet[2817]: W0709 14:58:14.680066 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.681088 kubelet[2817]: E0709 14:58:14.680518 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.681088 kubelet[2817]: E0709 14:58:14.680835 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.681088 kubelet[2817]: W0709 14:58:14.680849 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.681704 kubelet[2817]: E0709 14:58:14.681539 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.681782 kubelet[2817]: E0709 14:58:14.681708 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.681782 kubelet[2817]: W0709 14:58:14.681719 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.682061 kubelet[2817]: E0709 14:58:14.681976 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.682223 kubelet[2817]: E0709 14:58:14.682191 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.682223 kubelet[2817]: W0709 14:58:14.682206 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.682556 kubelet[2817]: E0709 14:58:14.682500 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.683166 kubelet[2817]: E0709 14:58:14.682661 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.683166 kubelet[2817]: W0709 14:58:14.682677 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.683166 kubelet[2817]: E0709 14:58:14.682706 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.687390 kubelet[2817]: E0709 14:58:14.687016 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.687390 kubelet[2817]: W0709 14:58:14.687077 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.687390 kubelet[2817]: E0709 14:58:14.687112 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:14.688252 kubelet[2817]: E0709 14:58:14.687787 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.688252 kubelet[2817]: W0709 14:58:14.687833 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.688252 kubelet[2817]: E0709 14:58:14.687847 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.689580 kubelet[2817]: E0709 14:58:14.689549 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.692003 kubelet[2817]: W0709 14:58:14.690301 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.692003 kubelet[2817]: E0709 14:58:14.690327 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.692219 containerd[1558]: time="2025-07-09T14:58:14.690384081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7cf446b48-r6jhs,Uid:1f4bd092-2b1f-4cd5-81dc-313f7b55b38a,Namespace:calico-system,Attempt:0,} returns sandbox id \"6ae00df21446f64a3516fe17c241a09fa9e9c92c85165e0b2541c2202e4f225c\"" Jul 9 14:58:14.693868 containerd[1558]: time="2025-07-09T14:58:14.693827729Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 9 14:58:14.694728 containerd[1558]: time="2025-07-09T14:58:14.694661884Z" level=info msg="connecting to shim 09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac" address="unix:///run/containerd/s/c728b76101f95f4b4b808e74a0222358b809720afc928cd20dc9dfddd726edef" namespace=k8s.io protocol=ttrpc version=3 Jul 9 14:58:14.743526 kubelet[2817]: E0709 14:58:14.743346 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:14.743526 kubelet[2817]: W0709 14:58:14.743386 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:14.744498 kubelet[2817]: E0709 14:58:14.744097 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:14.748115 systemd[1]: Started cri-containerd-09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac.scope - libcontainer container 09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac. 
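The repeated "unexpected end of JSON input" / "executable file not found in $PATH" pairs above come from the kubelet's FlexVolume prober: it keeps invoking /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the single argument init and expects a JSON status on stdout, but no executable is installed at that path, so the output is empty and cannot be unmarshalled. Below is a minimal sketch of a stand-in driver that would satisfy the probe; it assumes only the documented FlexVolume call convention (a JSON object with "status" and optional "capabilities" printed to stdout) and is not the real node-agent UDS driver.

// flexvol_stub.go - hypothetical stand-in for the missing nodeagent~uds/uds
// binary, shown only to illustrate the call convention the kubelet expects.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON shape the FlexVolume prober parses.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		// Answer the probe: report success and declare that this driver
		// does not implement attach/detach.
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
		return
	}
	// Every other driver call is unsupported in this sketch.
	out, _ := json.Marshal(driverStatus{Status: "Not supported"})
	fmt.Println(string(out))
	os.Exit(1)
}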
Jul 9 14:58:14.807440 containerd[1558]: time="2025-07-09T14:58:14.807367839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-4mm6r,Uid:36d23b64-d4b5-412f-b96c-ed79214183e6,Namespace:calico-system,Attempt:0,} returns sandbox id \"09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac\"" Jul 9 14:58:16.509745 kubelet[2817]: E0709 14:58:16.508725 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:16.770377 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2997540475.mount: Deactivated successfully. Jul 9 14:58:18.595234 kubelet[2817]: E0709 14:58:18.575938 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:20.541327 kubelet[2817]: E0709 14:58:20.540995 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:22.589429 kubelet[2817]: E0709 14:58:22.552714 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:26.877368 kubelet[2817]: E0709 14:58:24.512485 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:26.877368 kubelet[2817]: E0709 14:58:26.510733 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:28.153820 systemd[1]: cri-containerd-ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1.scope: Deactivated successfully. Jul 9 14:58:28.157102 systemd[1]: cri-containerd-ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1.scope: Consumed 2.897s CPU time, 50.7M memory peak. 
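The recurring "network is not ready ... cni plugin not initialized" messages for csi-node-driver-sbhtt will persist until calico-node installs a CNI configuration for the runtime to pick up. A small diagnostic sketch follows; the directory /etc/cni/net.d is the usual default CNI configuration path and is an assumption here, not something taken from this log.

// cni_check.go - diagnostic sketch, assuming the default CNI config
// directory /etc/cni/net.d; the runtime keeps reporting
// "cni plugin not initialized" while no .conf/.conflist file exists there.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		os.Exit(1)
	}
	found := 0
	for _, e := range entries {
		ext := filepath.Ext(e.Name())
		if ext == ".conf" || ext == ".conflist" || ext == ".json" {
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration installed yet; node network stays NotReady")
	}
}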
Jul 9 14:58:28.185717 containerd[1558]: time="2025-07-09T14:58:28.185245323Z" level=info msg="received exit event container_id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" pid:2651 exit_status:1 exited_at:{seconds:1752073108 nanos:180139561}" Jul 9 14:58:28.187279 containerd[1558]: time="2025-07-09T14:58:28.186187937Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" pid:2651 exit_status:1 exited_at:{seconds:1752073108 nanos:180139561}" Jul 9 14:58:28.412376 kubelet[2817]: E0709 14:58:28.411063 2817 controller.go:195] "Failed to update lease" err="etcdserver: request timed out" Jul 9 14:58:28.511352 kubelet[2817]: E0709 14:58:28.510785 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:28.617030 systemd[1]: cri-containerd-e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e.scope: Deactivated successfully. Jul 9 14:58:28.617951 systemd[1]: cri-containerd-e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e.scope: Consumed 7.101s CPU time, 80M memory peak. Jul 9 14:58:28.777648 containerd[1558]: time="2025-07-09T14:58:28.775604045Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\" id:\"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\" pid:3140 exit_status:1 exited_at:{seconds:1752073108 nanos:774048008}" Jul 9 14:58:28.777648 containerd[1558]: time="2025-07-09T14:58:28.775956931Z" level=info msg="received exit event container_id:\"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\" id:\"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\" pid:3140 exit_status:1 exited_at:{seconds:1752073108 nanos:774048008}" Jul 9 14:58:28.828046 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1-rootfs.mount: Deactivated successfully. Jul 9 14:58:28.874159 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e-rootfs.mount: Deactivated successfully. 
Jul 9 14:58:46.769058 containerd[1558]: time="2025-07-09T14:58:32.009835501Z" level=info msg="TaskExit event in podsandbox handler container_id:\"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\" id:\"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\" pid:2663 exit_status:1 exited_at:{seconds:1752073112 nanos:8333872}" Jul 9 14:58:46.769058 containerd[1558]: time="2025-07-09T14:58:32.011833597Z" level=info msg="received exit event container_id:\"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\" id:\"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\" pid:2663 exit_status:1 exited_at:{seconds:1752073112 nanos:8333872}" Jul 9 14:58:46.769058 containerd[1558]: time="2025-07-09T14:58:38.189775549Z" level=error msg="failed to handle container TaskExit event container_id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" pid:2651 exit_status:1 exited_at:{seconds:1752073108 nanos:180139561}" error="failed to stop container: failed to delete task: context deadline exceeded" Jul 9 14:58:46.769058 containerd[1558]: time="2025-07-09T14:58:38.776916843Z" level=error msg="failed to handle container TaskExit event container_id:\"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\" id:\"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\" pid:3140 exit_status:1 exited_at:{seconds:1752073108 nanos:774048008}" error="failed to stop container: failed to delete task: context deadline exceeded" Jul 9 14:58:46.769058 containerd[1558]: time="2025-07-09T14:58:39.435954503Z" level=info msg="TaskExit event container_id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" pid:2651 exit_status:1 exited_at:{seconds:1752073108 nanos:180139561}" Jul 9 14:58:46.769058 containerd[1558]: time="2025-07-09T14:58:41.436534507Z" level=error msg="get state for ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1" error="context deadline exceeded" Jul 9 14:58:46.769058 containerd[1558]: time="2025-07-09T14:58:41.436604781Z" level=warning msg="unknown status" status=0 Jul 9 14:58:32.003297 systemd[1]: cri-containerd-66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d.scope: Deactivated successfully. 
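The TaskExit entries above are first reported, then fail to be handled ("failed to stop container: failed to delete task: context deadline exceeded"), and are replayed a few seconds later. For watching the same event stream out-of-band, a rough sketch using the containerd Go client (assumed to be importable as github.com/containerd/containerd and talking to the default socket) could look like this:

// task_exit_watch.go - minimal sketch that subscribes to containerd events
// and prints the /tasks/exit envelopes this log shows being retried.
package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatalf("connect to containerd: %v", err)
	}
	defer client.Close()

	// Kubelet-managed containers live in the k8s.io namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	envelopes, errs := client.Subscribe(ctx)
	for {
		select {
		case env := <-envelopes:
			if env == nil {
				return // stream closed
			}
			if env.Topic == "/tasks/exit" {
				log.Printf("task exit event in namespace %s", env.Namespace)
			}
		case err := <-errs:
			log.Fatalf("event stream error: %v", err)
		}
	}
}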
Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:28.879615 2817 kubelet_node_status.go:535] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-07-09T14:58:21Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-07-09T14:58:21Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-07-09T14:58:21Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-07-09T14:58:21Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\\\",\\\"registry.k8s.io/etcd:3.5.15-0\\\"],\\\"sizeBytes\\\":56909194},{\\\"names\\\":[\\\"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\\\",\\\"registry.k8s.io/kube-proxy:v1.31.10\\\"],\\\"sizeBytes\\\":30382962},{\\\"names\\\":[\\\"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\\\",\\\"registry.k8s.io/kube-apiserver:v1.31.10\\\"],\\\"sizeBytes\\\":28074544},{\\\"names\\\":[\\\"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\\\",\\\"registry.k8s.io/kube-controller-manager:v1.31.10\\\"],\\\"sizeBytes\\\":26315128},{\\\"names\\\":[\\\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\\\",\\\"quay.io/tigera/operator:v1.38.3\\\"],\\\"sizeBytes\\\":25052538},{\\\"names\\\":[\\\"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\\\",\\\"registry.k8s.io/kube-scheduler:v1.31.10\\\"],\\\"sizeBytes\\\":20385523},{\\\"names\\\":[\\\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\\\",\\\"registry.k8s.io/coredns/coredns:v1.11.3\\\"],\\\"sizeBytes\\\":18562039},{\\\"names\\\":[\\\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\\\",\\\"registry.k8s.io/pause:3.10\\\"],\\\"sizeBytes\\\":320368}]}}\" for node \"ci-9999-9-100-bf645a1a30.novalocal\": etcdserver: request timed out" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.254440 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.784002 kubelet[2817]: W0709 14:58:30.254521 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.254640 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.255443 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.784002 kubelet[2817]: W0709 14:58:30.255499 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.255528 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.255917 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.784002 kubelet[2817]: W0709 14:58:30.255933 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.255991 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.256320 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.784002 kubelet[2817]: W0709 14:58:30.256332 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.256366 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.256726 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.784002 kubelet[2817]: W0709 14:58:30.256740 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.256754 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.257113 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.784002 kubelet[2817]: W0709 14:58:30.257165 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.257181 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:46.784002 kubelet[2817]: E0709 14:58:30.257485 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.790052 containerd[1558]: time="2025-07-09T14:58:42.012741039Z" level=error msg="failed to handle container TaskExit event container_id:\"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\" id:\"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\" pid:2663 exit_status:1 exited_at:{seconds:1752073112 nanos:8333872}" error="failed to stop container: failed to delete task: context deadline exceeded" Jul 9 14:58:46.790052 containerd[1558]: time="2025-07-09T14:58:43.437983454Z" level=error msg="get state for ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1" error="context deadline exceeded" Jul 9 14:58:46.790052 containerd[1558]: time="2025-07-09T14:58:43.438041183Z" level=warning msg="unknown status" status=0 Jul 9 14:58:46.790052 containerd[1558]: time="2025-07-09T14:58:45.886098675Z" level=error msg="get state for ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1" error="context deadline exceeded" Jul 9 14:58:46.790052 containerd[1558]: time="2025-07-09T14:58:45.886465883Z" level=warning msg="unknown status" status=0 Jul 9 14:58:32.005916 systemd[1]: cri-containerd-66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d.scope: Consumed 1.539s CPU time, 18M memory peak. Jul 9 14:58:46.790531 kubelet[2817]: W0709 14:58:30.257508 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.257522 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.257786 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.790531 kubelet[2817]: W0709 14:58:30.257798 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.257810 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.258078 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.790531 kubelet[2817]: W0709 14:58:30.258102 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.258116 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.258361 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.790531 kubelet[2817]: W0709 14:58:30.258373 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.258384 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.258671 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.790531 kubelet[2817]: W0709 14:58:30.258683 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.258695 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.258906 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.790531 kubelet[2817]: W0709 14:58:30.258929 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.258955 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.259407 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.790531 kubelet[2817]: W0709 14:58:30.259419 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.259431 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.259660 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.790531 kubelet[2817]: W0709 14:58:30.259671 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.259683 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.259925 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:46.790531 kubelet[2817]: W0709 14:58:30.259948 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.259961 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:30.507897 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:31.978980 2817 controller.go:195] "Failed to update lease" err="Operation cannot be fulfilled on leases.coordination.k8s.io \"ci-9999-9-100-bf645a1a30.novalocal\": the object has been modified; please apply your changes to the latest version and try again" Jul 9 14:58:46.790531 kubelet[2817]: E0709 14:58:32.508937 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:32.077263 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d-rootfs.mount: Deactivated successfully. 
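The lease errors above ("Failed to update lease", "the object has been modified", and later "Failed to ensure lease exists, will retry") together with the rejected node-status patches all point at slow etcd/apiserver round-trips rather than at the kubelet itself. A hedged sketch for inspecting this node's lease from outside the node follows; the kubeconfig path is an assumption, while the node name and the kube-node-lease namespace are taken from the log.

// lease_check.go - sketch using client-go to read the node lease the
// kubelet keeps failing to renew; kubeconfig path is an assumption.
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf") // assumed path
	if err != nil {
		log.Fatalf("load kubeconfig: %v", err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatalf("build clientset: %v", err)
	}

	// Node leases live in kube-node-lease, one Lease object per node.
	lease, err := cs.CoordinationV1().Leases("kube-node-lease").Get(
		context.Background(), "ci-9999-9-100-bf645a1a30.novalocal", metav1.GetOptions{})
	if err != nil {
		log.Fatalf("get lease: %v", err)
	}
	if lease.Spec.HolderIdentity == nil || lease.Spec.RenewTime == nil {
		fmt.Println("lease present but holder/renewTime not set yet")
		return
	}
	fmt.Printf("holder=%s lastRenew=%s\n", *lease.Spec.HolderIdentity, lease.Spec.RenewTime.Time)
}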
Jul 9 14:58:46.796030 kubelet[2817]: E0709 14:58:34.507893 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:46.796030 kubelet[2817]: E0709 14:58:36.507820 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:46.796030 kubelet[2817]: E0709 14:58:38.508185 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:46.796030 kubelet[2817]: E0709 14:58:38.956632 2817 event.go:359] "Server rejected event (will not retry!)" err="etcdserver: request timed out" event="&Event{ObjectMeta:{csi-node-driver-sbhtt.18509d3254b00395 calico-system 753 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:csi-node-driver-sbhtt,UID:2052df22-65ee-4914-9e6a-b1a620327c58,APIVersion:v1,ResourceVersion:719,FieldPath:,},Reason:NetworkNotReady,Message:network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized,Source:EventSource{Component:kubelet,Host:ci-9999-9-100-bf645a1a30.novalocal,},FirstTimestamp:2025-07-09 14:58:14 +0000 UTC,LastTimestamp:2025-07-09 14:58:18.575866362 +0000 UTC m=+28.259167063,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-9-100-bf645a1a30.novalocal,}" Jul 9 14:58:46.796030 kubelet[2817]: I0709 14:58:39.000740 2817 status_manager.go:875] "Failed to update status for pod" pod="kube-system/kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c767b891-e089-4548-9421-29dc0005e674\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-07-09T14:58:30Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-07-09T14:58:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"containerd://58b18b0d70b2222620d23f3f729e1ec78221364de451a263117e20f4618dfa58\\\",\\\"image\\\":\\\"registry.k8s.io/kube-apiserver:v1.31.10\\\",\\\"imageID\\\":\\\"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-07-09T14:57:43Z\\\"}}}]}}\" for pod \"kube-system\"/\"kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal\": etcdserver: request timed out" Jul 9 14:58:46.796030 kubelet[2817]: E0709 14:58:40.507987 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:46.796030 kubelet[2817]: E0709 14:58:41.980212 2817 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.222:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-9-100-bf645a1a30.novalocal?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Jul 9 14:58:46.796030 kubelet[2817]: E0709 14:58:42.509416 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:46.796030 kubelet[2817]: E0709 14:58:43.953974 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:46.811201 kubelet[2817]: E0709 14:58:46.809204 2817 event.go:359] "Server rejected event (will not retry!)" err="etcdserver: request timed out" event="&Event{ObjectMeta:{csi-node-driver-sbhtt.18509d3254b00395 calico-system 753 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:csi-node-driver-sbhtt,UID:2052df22-65ee-4914-9e6a-b1a620327c58,APIVersion:v1,ResourceVersion:719,FieldPath:,},Reason:NetworkNotReady,Message:network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized,Source:EventSource{Component:kubelet,Host:ci-9999-9-100-bf645a1a30.novalocal,},FirstTimestamp:2025-07-09 14:58:14 +0000 UTC,LastTimestamp:2025-07-09 14:58:20.540890633 +0000 UTC m=+30.224191354,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-9-100-bf645a1a30.novalocal,}" Jul 9 14:58:46.818847 kubelet[2817]: E0709 14:58:46.818643 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:48.308562 kubelet[2817]: E0709 14:58:48.308378 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.309764 kubelet[2817]: W0709 14:58:48.308582 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.309764 kubelet[2817]: E0709 14:58:48.308719 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.310334 kubelet[2817]: E0709 14:58:48.310270 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.310334 kubelet[2817]: W0709 14:58:48.310306 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.310648 kubelet[2817]: E0709 14:58:48.310371 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.311106 kubelet[2817]: E0709 14:58:48.311040 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.311230 kubelet[2817]: W0709 14:58:48.311145 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.311373 kubelet[2817]: E0709 14:58:48.311182 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.312277 kubelet[2817]: E0709 14:58:48.312217 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.312277 kubelet[2817]: W0709 14:58:48.312262 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.312586 kubelet[2817]: E0709 14:58:48.312294 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.313251 kubelet[2817]: E0709 14:58:48.313194 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.313251 kubelet[2817]: W0709 14:58:48.313239 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.313538 kubelet[2817]: E0709 14:58:48.313275 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:48.313971 kubelet[2817]: E0709 14:58:48.313909 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.313971 kubelet[2817]: W0709 14:58:48.313954 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.314195 kubelet[2817]: E0709 14:58:48.313993 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.314971 kubelet[2817]: E0709 14:58:48.314622 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.314971 kubelet[2817]: W0709 14:58:48.314671 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.314971 kubelet[2817]: E0709 14:58:48.314706 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.315372 kubelet[2817]: E0709 14:58:48.315330 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.315530 kubelet[2817]: W0709 14:58:48.315372 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.315530 kubelet[2817]: E0709 14:58:48.315431 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.316221 kubelet[2817]: E0709 14:58:48.316162 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.316221 kubelet[2817]: W0709 14:58:48.316209 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.316527 kubelet[2817]: E0709 14:58:48.316245 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.316891 kubelet[2817]: E0709 14:58:48.316851 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.317038 kubelet[2817]: W0709 14:58:48.316893 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.317038 kubelet[2817]: E0709 14:58:48.316927 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:48.317438 kubelet[2817]: E0709 14:58:48.317401 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.317672 kubelet[2817]: W0709 14:58:48.317439 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.317672 kubelet[2817]: E0709 14:58:48.317548 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.318027 kubelet[2817]: E0709 14:58:48.317991 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.318027 kubelet[2817]: W0709 14:58:48.318023 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.318297 kubelet[2817]: E0709 14:58:48.318048 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.318682 kubelet[2817]: E0709 14:58:48.318544 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.318682 kubelet[2817]: W0709 14:58:48.318576 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.318682 kubelet[2817]: E0709 14:58:48.318601 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.319518 kubelet[2817]: E0709 14:58:48.319080 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.319518 kubelet[2817]: W0709 14:58:48.319113 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.319518 kubelet[2817]: E0709 14:58:48.319182 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:48.319848 kubelet[2817]: E0709 14:58:48.319692 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:48.319848 kubelet[2817]: W0709 14:58:48.319717 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:48.319848 kubelet[2817]: E0709 14:58:48.319741 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:48.511692 kubelet[2817]: E0709 14:58:48.507856 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:49.438504 containerd[1558]: time="2025-07-09T14:58:49.438134353Z" level=error msg="Failed to handle backOff event container_id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" pid:2651 exit_status:1 exited_at:{seconds:1752073108 nanos:180139561} for ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1" error="failed to handle container TaskExit event: failed to stop container: failed to delete task: context deadline exceeded" Jul 9 14:58:49.438504 containerd[1558]: time="2025-07-09T14:58:49.438417572Z" level=info msg="TaskExit event container_id:\"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\" id:\"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\" pid:2663 exit_status:1 exited_at:{seconds:1752073112 nanos:8333872}" Jul 9 14:58:50.051616 containerd[1558]: time="2025-07-09T14:58:50.051187903Z" level=error msg="ttrpc: received message on inactive stream" stream=45 Jul 9 14:58:50.051616 containerd[1558]: time="2025-07-09T14:58:50.051581058Z" level=error msg="ttrpc: received message on inactive stream" stream=51 Jul 9 14:58:50.051616 containerd[1558]: time="2025-07-09T14:58:50.051609152Z" level=error msg="ttrpc: received message on inactive stream" stream=41 Jul 9 14:58:50.052574 containerd[1558]: time="2025-07-09T14:58:50.051677952Z" level=error msg="ttrpc: received message on inactive stream" stream=49 Jul 9 14:58:50.054626 containerd[1558]: time="2025-07-09T14:58:50.054292573Z" level=error msg="ttrpc: received message on inactive stream" stream=53 Jul 9 14:58:50.054626 containerd[1558]: time="2025-07-09T14:58:50.054416327Z" level=error msg="ttrpc: received message on inactive stream" stream=37 Jul 9 14:58:50.055364 containerd[1558]: time="2025-07-09T14:58:50.055224521Z" level=error msg="ttrpc: received message on inactive stream" stream=43 Jul 9 14:58:50.081841 containerd[1558]: time="2025-07-09T14:58:50.081763865Z" level=info msg="Ensure that container 66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d in task-service has been cleanup successfully" Jul 9 14:58:50.404165 containerd[1558]: time="2025-07-09T14:58:50.404040449Z" level=info msg="TaskExit event container_id:\"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\" id:\"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\" pid:3140 exit_status:1 exited_at:{seconds:1752073108 nanos:774048008}" Jul 9 14:58:50.413254 containerd[1558]: time="2025-07-09T14:58:50.413082555Z" level=info msg="Ensure that container e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e in task-service has been cleanup successfully" Jul 9 14:58:50.488933 containerd[1558]: time="2025-07-09T14:58:50.488527481Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Jul 9 14:58:50.492366 containerd[1558]: time="2025-07-09T14:58:50.491417625Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 
14:58:50.510400 containerd[1558]: time="2025-07-09T14:58:50.510351479Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:58:50.510755 kubelet[2817]: E0709 14:58:50.510690 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:50.545239 containerd[1558]: time="2025-07-09T14:58:50.545190080Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:58:50.546760 containerd[1558]: time="2025-07-09T14:58:50.546162876Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 35.852262921s" Jul 9 14:58:50.546859 containerd[1558]: time="2025-07-09T14:58:50.546772002Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Jul 9 14:58:50.551054 containerd[1558]: time="2025-07-09T14:58:50.550973925Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 9 14:58:50.579645 containerd[1558]: time="2025-07-09T14:58:50.579548209Z" level=info msg="CreateContainer within sandbox \"6ae00df21446f64a3516fe17c241a09fa9e9c92c85165e0b2541c2202e4f225c\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 9 14:58:50.707510 containerd[1558]: time="2025-07-09T14:58:50.704381824Z" level=info msg="Container 90559f9fff6bfda99bd405d43fadce1c8e56759920c87d2f5e71e110dd540d57: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:58:50.854975 kubelet[2817]: I0709 14:58:50.854879 2817 scope.go:117] "RemoveContainer" containerID="e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e" Jul 9 14:58:50.863007 containerd[1558]: time="2025-07-09T14:58:50.862906511Z" level=info msg="CreateContainer within sandbox \"6ae00df21446f64a3516fe17c241a09fa9e9c92c85165e0b2541c2202e4f225c\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"90559f9fff6bfda99bd405d43fadce1c8e56759920c87d2f5e71e110dd540d57\"" Jul 9 14:58:50.864510 kubelet[2817]: I0709 14:58:50.864410 2817 scope.go:117] "RemoveContainer" containerID="66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d" Jul 9 14:58:50.866433 containerd[1558]: time="2025-07-09T14:58:50.866346248Z" level=info msg="CreateContainer within sandbox \"34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jul 9 14:58:50.867831 containerd[1558]: time="2025-07-09T14:58:50.867782172Z" level=info msg="StartContainer for \"90559f9fff6bfda99bd405d43fadce1c8e56759920c87d2f5e71e110dd540d57\"" Jul 9 14:58:50.870133 containerd[1558]: time="2025-07-09T14:58:50.870061928Z" level=info msg="connecting to shim 90559f9fff6bfda99bd405d43fadce1c8e56759920c87d2f5e71e110dd540d57" 
address="unix:///run/containerd/s/46e42033a9a38a814af42db25f0793ac76a81f0820f7783ad9417eb33e3967f5" protocol=ttrpc version=3 Jul 9 14:58:50.872373 containerd[1558]: time="2025-07-09T14:58:50.872284746Z" level=info msg="CreateContainer within sandbox \"96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jul 9 14:58:50.939917 systemd[1]: Started cri-containerd-90559f9fff6bfda99bd405d43fadce1c8e56759920c87d2f5e71e110dd540d57.scope - libcontainer container 90559f9fff6bfda99bd405d43fadce1c8e56759920c87d2f5e71e110dd540d57. Jul 9 14:58:50.943494 kubelet[2817]: E0709 14:58:50.943104 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.943494 kubelet[2817]: W0709 14:58:50.943151 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.943494 kubelet[2817]: E0709 14:58:50.943178 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:50.947583 kubelet[2817]: E0709 14:58:50.947530 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.947961 kubelet[2817]: W0709 14:58:50.947701 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.948157 kubelet[2817]: E0709 14:58:50.948040 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:50.949465 kubelet[2817]: E0709 14:58:50.949262 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.949465 kubelet[2817]: W0709 14:58:50.949286 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.949465 kubelet[2817]: E0709 14:58:50.949308 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:50.951020 kubelet[2817]: E0709 14:58:50.950816 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.951020 kubelet[2817]: W0709 14:58:50.950838 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.951020 kubelet[2817]: E0709 14:58:50.950856 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:50.951687 kubelet[2817]: E0709 14:58:50.951606 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.951687 kubelet[2817]: W0709 14:58:50.951626 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.951779 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.952160 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.965668 kubelet[2817]: W0709 14:58:50.952220 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.952235 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.953618 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.965668 kubelet[2817]: W0709 14:58:50.953686 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.953710 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.954080 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.965668 kubelet[2817]: W0709 14:58:50.954096 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.955696 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.957622 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.965668 kubelet[2817]: W0709 14:58:50.957645 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.957669 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.960700 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.965668 kubelet[2817]: W0709 14:58:50.960721 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.960746 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.960979 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.965668 kubelet[2817]: W0709 14:58:50.960989 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.961000 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.961201 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.965668 kubelet[2817]: W0709 14:58:50.961211 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.961222 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.961380 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.965668 kubelet[2817]: W0709 14:58:50.961390 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.961416 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.961600 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.965668 kubelet[2817]: W0709 14:58:50.961610 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.961620 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.961836 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:50.965668 kubelet[2817]: W0709 14:58:50.961847 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:50.965668 kubelet[2817]: E0709 14:58:50.961863 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.024894 containerd[1558]: time="2025-07-09T14:58:51.024828192Z" level=info msg="Container 4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:58:51.025649 kubelet[2817]: E0709 14:58:51.025614 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.025847 kubelet[2817]: W0709 14:58:51.025773 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.025847 kubelet[2817]: E0709 14:58:51.025803 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.026372 kubelet[2817]: E0709 14:58:51.026300 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.026372 kubelet[2817]: W0709 14:58:51.026319 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.026372 kubelet[2817]: E0709 14:58:51.026335 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.026873 kubelet[2817]: E0709 14:58:51.026853 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.026873 kubelet[2817]: W0709 14:58:51.026904 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.026873 kubelet[2817]: E0709 14:58:51.026921 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:51.027434 kubelet[2817]: E0709 14:58:51.027386 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.027758 kubelet[2817]: W0709 14:58:51.027578 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.027758 kubelet[2817]: E0709 14:58:51.027598 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.028980 kubelet[2817]: E0709 14:58:51.028907 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.028980 kubelet[2817]: W0709 14:58:51.028925 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.028980 kubelet[2817]: E0709 14:58:51.028940 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.029995 kubelet[2817]: E0709 14:58:51.029952 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.029995 kubelet[2817]: W0709 14:58:51.029966 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.030195 kubelet[2817]: E0709 14:58:51.030123 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.046489 containerd[1558]: time="2025-07-09T14:58:51.045948945Z" level=info msg="Container d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:58:51.119935 containerd[1558]: time="2025-07-09T14:58:51.119887515Z" level=info msg="CreateContainer within sandbox \"34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\"" Jul 9 14:58:51.120874 containerd[1558]: time="2025-07-09T14:58:51.120815445Z" level=info msg="StartContainer for \"90559f9fff6bfda99bd405d43fadce1c8e56759920c87d2f5e71e110dd540d57\" returns successfully" Jul 9 14:58:51.121521 containerd[1558]: time="2025-07-09T14:58:51.121423458Z" level=info msg="StartContainer for \"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\"" Jul 9 14:58:51.123047 containerd[1558]: time="2025-07-09T14:58:51.122884509Z" level=info msg="connecting to shim 4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a" address="unix:///run/containerd/s/2cbdfe341178e08aaffb918245755042099acbdbbc5d0bb9c5629dad14c7d015" protocol=ttrpc version=3 Jul 9 14:58:51.178030 systemd[1]: Started cri-containerd-4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a.scope - libcontainer container 4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a. 
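The repeated kubelet errors in this stretch of the log all describe a single failure chain: the FlexVolume prober invokes the driver at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the single argument "init", the executable is missing, so the call produces no output, and decoding that empty output is what yields "unexpected end of JSON input". The Go sketch below is a minimal reproduction of that chain, assuming the usual FlexVolume call convention (driver run with "init", stdout parsed as JSON); the driverStatus struct and the sample "expected" reply are illustrative assumptions, not kubelet's own types.

    // Minimal sketch of the failure mode logged above, assuming the standard
    // FlexVolume convention: kubelet runs "<driver> init" and JSON-decodes stdout.
    // driverStatus is illustrative, not kubelet's actual DriverStatus type.
    package main

    import (
        "encoding/json"
        "fmt"
        "os/exec"
    )

    type driverStatus struct {
        Status       string          `json:"status"`
        Message      string          `json:"message,omitempty"`
        Capabilities map[string]bool `json:"capabilities,omitempty"`
    }

    func main() {
        // Driver path taken from the log entries above.
        driver := "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"

        // With the binary absent, the exec fails and stdout stays empty;
        // kubelet reports this step as "executable file not found in $PATH".
        out, err := exec.Command(driver, "init").Output()
        if err != nil {
            fmt.Println("driver call failed:", err)
        }

        // Unmarshalling the empty output reproduces the second error in the log.
        var st driverStatus
        if err := json.Unmarshal(out, &st); err != nil {
            fmt.Println("unmarshal error:", err) // "unexpected end of JSON input"
        }

        // What a working driver would typically print in response to "init".
        ok := driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}}
        b, _ := json.Marshal(ok)
        fmt.Println(string(b)) // {"status":"Success","capabilities":{"attach":false}}
    }

The surrounding containerd entries show containers still being created and started successfully, so this is a plugin-probing failure rather than a mount failure; the noise recurs on every re-probe and normally stops once the nodeagent~uds directory under the volume-plugin path either gains a working uds binary or is removed. The log itself does not indicate which was intended on this node.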
Jul 9 14:58:51.200883 containerd[1558]: time="2025-07-09T14:58:51.200797448Z" level=info msg="CreateContainer within sandbox \"96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a\"" Jul 9 14:58:51.202754 containerd[1558]: time="2025-07-09T14:58:51.202710908Z" level=info msg="StartContainer for \"d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a\"" Jul 9 14:58:51.205854 containerd[1558]: time="2025-07-09T14:58:51.205720868Z" level=info msg="connecting to shim d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a" address="unix:///run/containerd/s/373b54ba6f381b4533cf616f65bd7adc560faa2f543308a3604164117a1a93d9" protocol=ttrpc version=3 Jul 9 14:58:51.255861 systemd[1]: Started cri-containerd-d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a.scope - libcontainer container d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a. Jul 9 14:58:51.369797 containerd[1558]: time="2025-07-09T14:58:51.369680493Z" level=info msg="StartContainer for \"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\" returns successfully" Jul 9 14:58:51.410853 containerd[1558]: time="2025-07-09T14:58:51.410806623Z" level=info msg="StartContainer for \"d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a\" returns successfully" Jul 9 14:58:51.567414 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3227273764.mount: Deactivated successfully. Jul 9 14:58:51.973044 kubelet[2817]: E0709 14:58:51.973008 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.973044 kubelet[2817]: W0709 14:58:51.973033 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.973044 kubelet[2817]: E0709 14:58:51.973057 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.973811 kubelet[2817]: E0709 14:58:51.973260 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.973811 kubelet[2817]: W0709 14:58:51.973270 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.973811 kubelet[2817]: E0709 14:58:51.973281 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:51.974716 kubelet[2817]: E0709 14:58:51.974614 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.974716 kubelet[2817]: W0709 14:58:51.974628 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.974716 kubelet[2817]: E0709 14:58:51.974640 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.975345 kubelet[2817]: E0709 14:58:51.975271 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.975345 kubelet[2817]: W0709 14:58:51.975283 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.975345 kubelet[2817]: E0709 14:58:51.975295 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.975637 kubelet[2817]: E0709 14:58:51.975482 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.975637 kubelet[2817]: W0709 14:58:51.975497 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.975637 kubelet[2817]: E0709 14:58:51.975509 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.975783 kubelet[2817]: E0709 14:58:51.975662 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.975783 kubelet[2817]: W0709 14:58:51.975674 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.975783 kubelet[2817]: E0709 14:58:51.975686 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.975899 kubelet[2817]: E0709 14:58:51.975832 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.975899 kubelet[2817]: W0709 14:58:51.975843 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.975899 kubelet[2817]: E0709 14:58:51.975853 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:51.976364 kubelet[2817]: E0709 14:58:51.976021 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.976364 kubelet[2817]: W0709 14:58:51.976032 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.976364 kubelet[2817]: E0709 14:58:51.976043 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.976364 kubelet[2817]: E0709 14:58:51.976287 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.976364 kubelet[2817]: W0709 14:58:51.976298 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.976364 kubelet[2817]: E0709 14:58:51.976309 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.976963 kubelet[2817]: E0709 14:58:51.976590 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.976963 kubelet[2817]: W0709 14:58:51.976601 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.976963 kubelet[2817]: E0709 14:58:51.976612 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.976963 kubelet[2817]: E0709 14:58:51.976896 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.976963 kubelet[2817]: W0709 14:58:51.976906 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.976963 kubelet[2817]: E0709 14:58:51.976916 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.977713 kubelet[2817]: E0709 14:58:51.977545 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.977713 kubelet[2817]: W0709 14:58:51.977560 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.977713 kubelet[2817]: E0709 14:58:51.977571 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:51.977713 kubelet[2817]: E0709 14:58:51.977688 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.977713 kubelet[2817]: W0709 14:58:51.977697 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.977713 kubelet[2817]: E0709 14:58:51.977707 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.978326 kubelet[2817]: E0709 14:58:51.977980 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.978326 kubelet[2817]: W0709 14:58:51.977990 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.978326 kubelet[2817]: E0709 14:58:51.978020 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.978326 kubelet[2817]: E0709 14:58:51.978171 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.978326 kubelet[2817]: W0709 14:58:51.978181 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.978326 kubelet[2817]: E0709 14:58:51.978190 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.979087 kubelet[2817]: E0709 14:58:51.978559 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.979087 kubelet[2817]: W0709 14:58:51.978570 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.979087 kubelet[2817]: E0709 14:58:51.978581 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.979087 kubelet[2817]: E0709 14:58:51.978709 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.979087 kubelet[2817]: W0709 14:58:51.978718 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.979087 kubelet[2817]: E0709 14:58:51.978727 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:51.979087 kubelet[2817]: E0709 14:58:51.978920 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.979087 kubelet[2817]: W0709 14:58:51.978930 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.979087 kubelet[2817]: E0709 14:58:51.978940 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.979601 kubelet[2817]: E0709 14:58:51.979126 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.979601 kubelet[2817]: W0709 14:58:51.979136 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.979601 kubelet[2817]: E0709 14:58:51.979148 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.979601 kubelet[2817]: E0709 14:58:51.979562 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.979601 kubelet[2817]: W0709 14:58:51.979572 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.979601 kubelet[2817]: E0709 14:58:51.979583 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.980162 kubelet[2817]: E0709 14:58:51.979739 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.980162 kubelet[2817]: W0709 14:58:51.979749 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.980162 kubelet[2817]: E0709 14:58:51.979759 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.980497 kubelet[2817]: E0709 14:58:51.980429 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.980497 kubelet[2817]: W0709 14:58:51.980445 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.980497 kubelet[2817]: E0709 14:58:51.980481 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:51.980980 kubelet[2817]: E0709 14:58:51.980963 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.980980 kubelet[2817]: W0709 14:58:51.980978 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.981234 kubelet[2817]: E0709 14:58:51.980989 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.981234 kubelet[2817]: E0709 14:58:51.981156 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.981234 kubelet[2817]: W0709 14:58:51.981166 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.981234 kubelet[2817]: E0709 14:58:51.981176 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.981560 kubelet[2817]: E0709 14:58:51.981305 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.981560 kubelet[2817]: W0709 14:58:51.981315 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.981560 kubelet[2817]: E0709 14:58:51.981324 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.981929 kubelet[2817]: E0709 14:58:51.981578 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.981929 kubelet[2817]: W0709 14:58:51.981589 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.981929 kubelet[2817]: E0709 14:58:51.981599 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.982332 kubelet[2817]: E0709 14:58:51.982308 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.982332 kubelet[2817]: W0709 14:58:51.982326 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.982569 kubelet[2817]: E0709 14:58:51.982337 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:51.982569 kubelet[2817]: E0709 14:58:51.982564 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.982569 kubelet[2817]: W0709 14:58:51.982576 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.982569 kubelet[2817]: E0709 14:58:51.982586 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.983331 kubelet[2817]: E0709 14:58:51.983304 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.983331 kubelet[2817]: W0709 14:58:51.983320 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.983331 kubelet[2817]: E0709 14:58:51.983331 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.983496 kubelet[2817]: E0709 14:58:51.983476 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:51.983496 kubelet[2817]: W0709 14:58:51.983489 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:51.983573 kubelet[2817]: E0709 14:58:51.983501 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:51.999237 kubelet[2817]: I0709 14:58:51.999118 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7cf446b48-r6jhs" podStartSLOduration=3.142153849 podStartE2EDuration="38.99907677s" podCreationTimestamp="2025-07-09 14:58:13 +0000 UTC" firstStartedPulling="2025-07-09 14:58:14.692423746 +0000 UTC m=+24.375724457" lastFinishedPulling="2025-07-09 14:58:50.549346677 +0000 UTC m=+60.232647378" observedRunningTime="2025-07-09 14:58:51.979723209 +0000 UTC m=+61.663023940" watchObservedRunningTime="2025-07-09 14:58:51.99907677 +0000 UTC m=+61.682377471" Jul 9 14:58:52.036114 kubelet[2817]: E0709 14:58:52.036088 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.036114 kubelet[2817]: W0709 14:58:52.036112 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.036376 kubelet[2817]: E0709 14:58:52.036132 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:52.036583 kubelet[2817]: E0709 14:58:52.036568 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.036583 kubelet[2817]: W0709 14:58:52.036582 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.036848 kubelet[2817]: E0709 14:58:52.036600 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.038624 kubelet[2817]: E0709 14:58:52.038586 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.038624 kubelet[2817]: W0709 14:58:52.038605 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.038889 kubelet[2817]: E0709 14:58:52.038768 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.039242 kubelet[2817]: E0709 14:58:52.039208 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.039242 kubelet[2817]: W0709 14:58:52.039223 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.039511 kubelet[2817]: E0709 14:58:52.039387 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.039831 kubelet[2817]: E0709 14:58:52.039800 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.039831 kubelet[2817]: W0709 14:58:52.039814 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.040067 kubelet[2817]: E0709 14:58:52.040048 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.040323 kubelet[2817]: E0709 14:58:52.040276 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.040323 kubelet[2817]: W0709 14:58:52.040293 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.040634 kubelet[2817]: E0709 14:58:52.040544 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:52.040835 kubelet[2817]: E0709 14:58:52.040822 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.040992 kubelet[2817]: W0709 14:58:52.040918 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.041138 kubelet[2817]: E0709 14:58:52.041076 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.041378 kubelet[2817]: E0709 14:58:52.041365 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.041554 kubelet[2817]: W0709 14:58:52.041492 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.041728 kubelet[2817]: E0709 14:58:52.041647 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.041876 kubelet[2817]: E0709 14:58:52.041848 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.041876 kubelet[2817]: W0709 14:58:52.041860 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.042065 kubelet[2817]: E0709 14:58:52.042051 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.042295 kubelet[2817]: E0709 14:58:52.042267 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.042295 kubelet[2817]: W0709 14:58:52.042280 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.042505 kubelet[2817]: E0709 14:58:52.042490 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.042726 kubelet[2817]: E0709 14:58:52.042696 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.042726 kubelet[2817]: W0709 14:58:52.042710 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.042930 kubelet[2817]: E0709 14:58:52.042900 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:52.043082 kubelet[2817]: E0709 14:58:52.043055 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.043082 kubelet[2817]: W0709 14:58:52.043068 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.043288 kubelet[2817]: E0709 14:58:52.043260 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.043549 kubelet[2817]: E0709 14:58:52.043521 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.043549 kubelet[2817]: W0709 14:58:52.043534 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.043738 kubelet[2817]: E0709 14:58:52.043682 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.044304 kubelet[2817]: E0709 14:58:52.044206 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.044304 kubelet[2817]: W0709 14:58:52.044221 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.044304 kubelet[2817]: E0709 14:58:52.044236 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.044702 kubelet[2817]: E0709 14:58:52.044672 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.044702 kubelet[2817]: W0709 14:58:52.044686 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.044904 kubelet[2817]: E0709 14:58:52.044889 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.045311 kubelet[2817]: E0709 14:58:52.045295 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.045438 kubelet[2817]: W0709 14:58:52.045406 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.045667 kubelet[2817]: E0709 14:58:52.045651 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:52.045808 kubelet[2817]: E0709 14:58:52.045782 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.045808 kubelet[2817]: W0709 14:58:52.045794 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.046043 kubelet[2817]: E0709 14:58:52.046023 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.046213 kubelet[2817]: E0709 14:58:52.046185 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.046213 kubelet[2817]: W0709 14:58:52.046198 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.046492 kubelet[2817]: E0709 14:58:52.046415 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.046796 kubelet[2817]: E0709 14:58:52.046757 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.046796 kubelet[2817]: W0709 14:58:52.046775 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.047092 kubelet[2817]: E0709 14:58:52.046993 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.047244 kubelet[2817]: E0709 14:58:52.047215 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.047244 kubelet[2817]: W0709 14:58:52.047230 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.047444 kubelet[2817]: E0709 14:58:52.047423 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.047788 kubelet[2817]: E0709 14:58:52.047758 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.047788 kubelet[2817]: W0709 14:58:52.047773 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.048159 kubelet[2817]: E0709 14:58:52.047898 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:52.048327 kubelet[2817]: E0709 14:58:52.048289 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.048417 kubelet[2817]: W0709 14:58:52.048402 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.048563 kubelet[2817]: E0709 14:58:52.048548 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.048909 kubelet[2817]: E0709 14:58:52.048888 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.049491 kubelet[2817]: W0709 14:58:52.049017 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.049491 kubelet[2817]: E0709 14:58:52.049037 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.050646 kubelet[2817]: E0709 14:58:52.050631 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.050733 kubelet[2817]: W0709 14:58:52.050719 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.050813 kubelet[2817]: E0709 14:58:52.050799 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:52.436958 containerd[1558]: time="2025-07-09T14:58:52.436751124Z" level=info msg="TaskExit event container_id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" id:\"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" pid:2651 exit_status:1 exited_at:{seconds:1752073108 nanos:180139561}" Jul 9 14:58:52.453569 containerd[1558]: time="2025-07-09T14:58:52.453429010Z" level=info msg="Ensure that container ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1 in task-service has been cleanup successfully" Jul 9 14:58:52.509503 kubelet[2817]: E0709 14:58:52.508569 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:52.892538 kubelet[2817]: I0709 14:58:52.891686 2817 scope.go:117] "RemoveContainer" containerID="ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1" Jul 9 14:58:52.899474 containerd[1558]: time="2025-07-09T14:58:52.898062330Z" level=info msg="CreateContainer within sandbox \"e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jul 9 14:58:52.959339 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2272149697.mount: Deactivated successfully. Jul 9 14:58:52.970533 containerd[1558]: time="2025-07-09T14:58:52.968379435Z" level=info msg="Container 59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:58:52.972846 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4179162953.mount: Deactivated successfully. Jul 9 14:58:52.993604 kubelet[2817]: E0709 14:58:52.993566 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.993604 kubelet[2817]: W0709 14:58:52.993597 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.994048 kubelet[2817]: E0709 14:58:52.993649 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.994048 kubelet[2817]: E0709 14:58:52.993861 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.994048 kubelet[2817]: W0709 14:58:52.993871 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.994048 kubelet[2817]: E0709 14:58:52.993881 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:52.994178 kubelet[2817]: E0709 14:58:52.994088 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.994178 kubelet[2817]: W0709 14:58:52.994099 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.994178 kubelet[2817]: E0709 14:58:52.994108 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.995280 kubelet[2817]: E0709 14:58:52.994639 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.995280 kubelet[2817]: W0709 14:58:52.994650 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.995280 kubelet[2817]: E0709 14:58:52.994660 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.995280 kubelet[2817]: E0709 14:58:52.994862 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.995280 kubelet[2817]: W0709 14:58:52.994872 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.995280 kubelet[2817]: E0709 14:58:52.994881 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.995280 kubelet[2817]: E0709 14:58:52.995110 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.995280 kubelet[2817]: W0709 14:58:52.995120 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.995280 kubelet[2817]: E0709 14:58:52.995132 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.995591 kubelet[2817]: E0709 14:58:52.995481 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.995591 kubelet[2817]: W0709 14:58:52.995492 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.995591 kubelet[2817]: E0709 14:58:52.995502 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:52.997467 kubelet[2817]: E0709 14:58:52.995874 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.997467 kubelet[2817]: W0709 14:58:52.995891 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.997467 kubelet[2817]: E0709 14:58:52.995903 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.997467 kubelet[2817]: E0709 14:58:52.996603 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.997467 kubelet[2817]: W0709 14:58:52.996618 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.997467 kubelet[2817]: E0709 14:58:52.996630 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.997467 kubelet[2817]: E0709 14:58:52.996877 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.997467 kubelet[2817]: W0709 14:58:52.996888 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.997467 kubelet[2817]: E0709 14:58:52.996901 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.997467 kubelet[2817]: E0709 14:58:52.997265 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.997467 kubelet[2817]: W0709 14:58:52.997277 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.997467 kubelet[2817]: E0709 14:58:52.997288 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.998130 kubelet[2817]: E0709 14:58:52.998108 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.998130 kubelet[2817]: W0709 14:58:52.998124 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.998213 kubelet[2817]: E0709 14:58:52.998136 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:52.998364 kubelet[2817]: E0709 14:58:52.998345 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.998364 kubelet[2817]: W0709 14:58:52.998360 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.998442 kubelet[2817]: E0709 14:58:52.998370 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.998661 kubelet[2817]: E0709 14:58:52.998640 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.998661 kubelet[2817]: W0709 14:58:52.998656 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.998737 kubelet[2817]: E0709 14:58:52.998666 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.998910 kubelet[2817]: E0709 14:58:52.998891 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.998910 kubelet[2817]: W0709 14:58:52.998906 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.998995 kubelet[2817]: E0709 14:58:52.998916 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.999138 kubelet[2817]: E0709 14:58:52.999119 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.999138 kubelet[2817]: W0709 14:58:52.999133 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.999207 kubelet[2817]: E0709 14:58:52.999144 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:52.999396 kubelet[2817]: E0709 14:58:52.999377 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:52.999396 kubelet[2817]: W0709 14:58:52.999392 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:52.999497 kubelet[2817]: E0709 14:58:52.999403 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:53.000802 kubelet[2817]: E0709 14:58:53.000689 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.000802 kubelet[2817]: W0709 14:58:53.000707 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.000802 kubelet[2817]: E0709 14:58:53.000718 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.001629 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.003464 kubelet[2817]: W0709 14:58:53.001645 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.001655 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.001792 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.003464 kubelet[2817]: W0709 14:58:53.001805 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.001814 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.001928 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.003464 kubelet[2817]: W0709 14:58:53.001937 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.002493 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.002648 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.003464 kubelet[2817]: W0709 14:58:53.002658 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.002667 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.002884 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.003464 kubelet[2817]: W0709 14:58:53.002893 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.002904 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.003233 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.003464 kubelet[2817]: W0709 14:58:53.003243 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.003464 kubelet[2817]: E0709 14:58:53.003253 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.004230 kubelet[2817]: E0709 14:58:53.003538 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.004230 kubelet[2817]: W0709 14:58:53.003549 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.004230 kubelet[2817]: E0709 14:58:53.003559 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.004230 kubelet[2817]: E0709 14:58:53.003929 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.004230 kubelet[2817]: W0709 14:58:53.003940 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.004230 kubelet[2817]: E0709 14:58:53.003951 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.004416 kubelet[2817]: E0709 14:58:53.004358 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.004416 kubelet[2817]: W0709 14:58:53.004369 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.004416 kubelet[2817]: E0709 14:58:53.004381 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:53.006586 kubelet[2817]: E0709 14:58:53.006565 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.006586 kubelet[2817]: W0709 14:58:53.006583 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.006665 kubelet[2817]: E0709 14:58:53.006595 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.006927 kubelet[2817]: E0709 14:58:53.006908 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.006927 kubelet[2817]: W0709 14:58:53.006926 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.007029 kubelet[2817]: E0709 14:58:53.006937 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.008199 kubelet[2817]: E0709 14:58:53.008177 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.008199 kubelet[2817]: W0709 14:58:53.008194 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.008281 kubelet[2817]: E0709 14:58:53.008207 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.008642 kubelet[2817]: E0709 14:58:53.008621 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.008642 kubelet[2817]: W0709 14:58:53.008638 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.008741 kubelet[2817]: E0709 14:58:53.008652 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.009934 kubelet[2817]: E0709 14:58:53.009160 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.009934 kubelet[2817]: W0709 14:58:53.009176 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.009934 kubelet[2817]: E0709 14:58:53.009187 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:53.010055 containerd[1558]: time="2025-07-09T14:58:53.009437802Z" level=info msg="CreateContainer within sandbox \"e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b\"" Jul 9 14:58:53.010203 kubelet[2817]: E0709 14:58:53.010184 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.010203 kubelet[2817]: W0709 14:58:53.010199 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.010288 kubelet[2817]: E0709 14:58:53.010210 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.010354 containerd[1558]: time="2025-07-09T14:58:53.010326937Z" level=info msg="StartContainer for \"59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b\"" Jul 9 14:58:53.011408 kubelet[2817]: E0709 14:58:53.010547 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.011408 kubelet[2817]: W0709 14:58:53.010564 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.011408 kubelet[2817]: E0709 14:58:53.010574 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.011408 kubelet[2817]: E0709 14:58:53.010789 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.011408 kubelet[2817]: W0709 14:58:53.010799 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.011408 kubelet[2817]: E0709 14:58:53.010810 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.011408 kubelet[2817]: E0709 14:58:53.011045 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.011408 kubelet[2817]: W0709 14:58:53.011055 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.011408 kubelet[2817]: E0709 14:58:53.011065 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:53.011839 kubelet[2817]: E0709 14:58:53.011493 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.011839 kubelet[2817]: W0709 14:58:53.011503 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.011839 kubelet[2817]: E0709 14:58:53.011514 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.011944 kubelet[2817]: E0709 14:58:53.011853 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.011944 kubelet[2817]: W0709 14:58:53.011863 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.011944 kubelet[2817]: E0709 14:58:53.011875 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.012145 kubelet[2817]: E0709 14:58:53.012124 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.012145 kubelet[2817]: W0709 14:58:53.012141 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.012230 kubelet[2817]: E0709 14:58:53.012153 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.012849 kubelet[2817]: E0709 14:58:53.012689 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.012849 kubelet[2817]: W0709 14:58:53.012708 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.012849 kubelet[2817]: E0709 14:58:53.012729 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:53.016183 containerd[1558]: time="2025-07-09T14:58:53.016105974Z" level=info msg="connecting to shim 59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b" address="unix:///run/containerd/s/6dcc86629619ae9c68f0b9972b137ba0897f120b3d1d0192184fc1f2c28250cd" protocol=ttrpc version=3 Jul 9 14:58:53.055327 kubelet[2817]: E0709 14:58:53.055284 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.055327 kubelet[2817]: W0709 14:58:53.055309 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.055495 kubelet[2817]: E0709 14:58:53.055342 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.055562 kubelet[2817]: E0709 14:58:53.055535 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.055562 kubelet[2817]: W0709 14:58:53.055545 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.055924 kubelet[2817]: E0709 14:58:53.055804 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.056201 kubelet[2817]: E0709 14:58:53.056091 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.056201 kubelet[2817]: W0709 14:58:53.056107 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.056201 kubelet[2817]: E0709 14:58:53.056185 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.056531 kubelet[2817]: E0709 14:58:53.056511 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.056531 kubelet[2817]: W0709 14:58:53.056528 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.057400 kubelet[2817]: E0709 14:58:53.057362 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:53.057805 kubelet[2817]: E0709 14:58:53.057635 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.057805 kubelet[2817]: W0709 14:58:53.057650 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.057919 kubelet[2817]: E0709 14:58:53.057899 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.057919 kubelet[2817]: W0709 14:58:53.057914 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.057995 kubelet[2817]: E0709 14:58:53.057926 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.058100 kubelet[2817]: E0709 14:58:53.057671 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.058730 kubelet[2817]: E0709 14:58:53.058321 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.058730 kubelet[2817]: W0709 14:58:53.058337 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.058730 kubelet[2817]: E0709 14:58:53.058482 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.058867 kubelet[2817]: E0709 14:58:53.058762 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.058867 kubelet[2817]: W0709 14:58:53.058773 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.058867 kubelet[2817]: E0709 14:58:53.058826 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.059661 kubelet[2817]: E0709 14:58:53.059529 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.059661 kubelet[2817]: W0709 14:58:53.059545 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.059661 kubelet[2817]: E0709 14:58:53.059612 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:53.061469 kubelet[2817]: E0709 14:58:53.060438 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.061469 kubelet[2817]: W0709 14:58:53.060493 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.061469 kubelet[2817]: E0709 14:58:53.060575 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.061469 kubelet[2817]: E0709 14:58:53.061285 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.061469 kubelet[2817]: W0709 14:58:53.061295 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.061652 kubelet[2817]: E0709 14:58:53.061483 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.061652 kubelet[2817]: E0709 14:58:53.061622 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.061652 kubelet[2817]: W0709 14:58:53.061632 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.063468 kubelet[2817]: E0709 14:58:53.061791 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.063468 kubelet[2817]: E0709 14:58:53.062031 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.063468 kubelet[2817]: W0709 14:58:53.062046 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.063468 kubelet[2817]: E0709 14:58:53.062070 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.063468 kubelet[2817]: E0709 14:58:53.062596 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.063468 kubelet[2817]: W0709 14:58:53.062612 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.063468 kubelet[2817]: E0709 14:58:53.062653 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 9 14:58:53.063468 kubelet[2817]: E0709 14:58:53.063108 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.063468 kubelet[2817]: W0709 14:58:53.063121 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.063468 kubelet[2817]: E0709 14:58:53.063134 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.063806 kubelet[2817]: E0709 14:58:53.063673 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.063806 kubelet[2817]: W0709 14:58:53.063686 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.063806 kubelet[2817]: E0709 14:58:53.063728 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.065394 kubelet[2817]: E0709 14:58:53.064777 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.065394 kubelet[2817]: W0709 14:58:53.064806 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.065394 kubelet[2817]: E0709 14:58:53.064823 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.070475 kubelet[2817]: E0709 14:58:53.068516 2817 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 9 14:58:53.070475 kubelet[2817]: W0709 14:58:53.068536 2817 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 9 14:58:53.070475 kubelet[2817]: E0709 14:58:53.068550 2817 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 9 14:58:53.150880 systemd[1]: Started cri-containerd-59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b.scope - libcontainer container 59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b. 
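[Editor's note] The repeated driver-call.go / plugins.go messages above all come from the kubelet probing the FlexVolume plugin directory: for each driver directory it execs the driver binary with the argument init and expects a JSON status object on stdout. Here the nodeagent~uds/uds binary does not exist, so the call fails, the captured output is empty, and unmarshalling that empty output is exactly what produces "unexpected end of JSON input". Below is a minimal sketch of that call path, not the kubelet's actual code; the DriverStatus field set is the conventional FlexVolume reply shape and the driver path is copied from the log.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// DriverStatus mirrors the JSON a FlexVolume driver is expected to print,
// e.g. {"status":"Success","capabilities":{"attach":false}} for "init".
// Field names here are illustrative, not the kubelet's exact struct.
type DriverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func callDriver(driver string, args ...string) (*DriverStatus, error) {
	// Fails here when the driver binary is absent, as in the log lines above.
	out, err := exec.Command(driver, args...).CombinedOutput()
	if err != nil {
		return nil, fmt.Errorf("driver call failed: %w, output: %q", err, out)
	}
	var st DriverStatus
	// Unmarshalling empty output is what yields "unexpected end of JSON input".
	if err := json.Unmarshal(out, &st); err != nil {
		return nil, fmt.Errorf("failed to unmarshal output %q for command %v: %w", out, args, err)
	}
	return &st, nil
}

func main() {
	st, err := callDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds", "init")
	if err != nil {
		fmt.Println("init failed:", err)
		return
	}
	fmt.Printf("init returned: %+v\n", st)
}
```

The usual remedies are either installing a real driver at that path whose init call prints a success JSON object, or removing the empty nodeagent~uds directory so the kubelet stops probing it; the noise itself is typically harmless.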
Jul 9 14:58:53.299951 containerd[1558]: time="2025-07-09T14:58:53.299804539Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:58:53.302733 containerd[1558]: time="2025-07-09T14:58:53.302655715Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Jul 9 14:58:53.304480 containerd[1558]: time="2025-07-09T14:58:53.304409931Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:58:53.329686 containerd[1558]: time="2025-07-09T14:58:53.329627640Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:58:53.337479 containerd[1558]: time="2025-07-09T14:58:53.335859494Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 2.784829485s" Jul 9 14:58:53.337479 containerd[1558]: time="2025-07-09T14:58:53.335923025Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Jul 9 14:58:53.343859 containerd[1558]: time="2025-07-09T14:58:53.343048494Z" level=info msg="StartContainer for \"59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b\" returns successfully" Jul 9 14:58:53.347488 containerd[1558]: time="2025-07-09T14:58:53.347062754Z" level=info msg="CreateContainer within sandbox \"09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 9 14:58:53.378760 containerd[1558]: time="2025-07-09T14:58:53.378704433Z" level=info msg="Container af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:58:53.408814 containerd[1558]: time="2025-07-09T14:58:53.408374934Z" level=info msg="CreateContainer within sandbox \"09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2\"" Jul 9 14:58:53.410199 containerd[1558]: time="2025-07-09T14:58:53.410097379Z" level=info msg="StartContainer for \"af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2\"" Jul 9 14:58:53.414389 containerd[1558]: time="2025-07-09T14:58:53.414351635Z" level=info msg="connecting to shim af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2" address="unix:///run/containerd/s/c728b76101f95f4b4b808e74a0222358b809720afc928cd20dc9dfddd726edef" protocol=ttrpc version=3 Jul 9 14:58:53.457807 systemd[1]: Started cri-containerd-af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2.scope - libcontainer container af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2. 
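[Editor's note] The CreateContainer / StartContainer / "connecting to shim ... protocol=ttrpc" / "Started cri-containerd-....scope" sequence above is containerd's CRI plugin creating the kube-controller-manager and flexvol-driver containers inside their pod sandboxes. The CRI plugin drives this internally; purely for illustration, roughly the same pull/create/start sequence through the public containerd Go client looks like the sketch below. The socket path, the "k8s.io" namespace, and the container ID are assumptions taken from common defaults and from the log, and running it requires root and a live containerd.

```go
package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/cio"
	"github.com/containerd/containerd/namespaces"
	"github.com/containerd/containerd/oci"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Kubernetes-managed images and containers live in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Mirrors the "PullImage ... returns image reference" entry above.
	image, err := client.Pull(ctx, "ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}

	container, err := client.NewContainer(ctx, "flexvol-driver-demo",
		containerd.WithImage(image),
		containerd.WithNewSnapshot("flexvol-driver-demo-snap", image),
		containerd.WithNewSpec(oci.WithImageConfig(image)),
	)
	if err != nil {
		log.Fatal(err)
	}
	defer container.Delete(ctx, containerd.WithSnapshotCleanup)

	// NewTask is the step that connects to the per-container shim over ttrpc.
	task, err := container.NewTask(ctx, cio.NewCreator(cio.WithStdio))
	if err != nil {
		log.Fatal(err)
	}
	defer task.Delete(ctx)

	// StartContainer in the log corresponds to starting the task.
	if err := task.Start(ctx); err != nil {
		log.Fatal(err)
	}
	log.Println("started", container.ID())
}
```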
Jul 9 14:58:53.970420 containerd[1558]: time="2025-07-09T14:58:53.969615061Z" level=info msg="StartContainer for \"af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2\" returns successfully" Jul 9 14:58:54.022632 systemd[1]: cri-containerd-af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2.scope: Deactivated successfully. Jul 9 14:58:54.041916 containerd[1558]: time="2025-07-09T14:58:54.041861191Z" level=info msg="received exit event container_id:\"af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2\" id:\"af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2\" pid:3751 exited_at:{seconds:1752073134 nanos:39776820}" Jul 9 14:58:54.043952 containerd[1558]: time="2025-07-09T14:58:54.043912760Z" level=info msg="TaskExit event in podsandbox handler container_id:\"af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2\" id:\"af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2\" pid:3751 exited_at:{seconds:1752073134 nanos:39776820}" Jul 9 14:58:54.118904 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2-rootfs.mount: Deactivated successfully. Jul 9 14:58:54.508941 kubelet[2817]: E0709 14:58:54.508432 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:54.989482 containerd[1558]: time="2025-07-09T14:58:54.988522836Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Jul 9 14:58:56.509269 kubelet[2817]: E0709 14:58:56.508326 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:58:58.512490 kubelet[2817]: E0709 14:58:58.509941 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:00.515002 kubelet[2817]: E0709 14:59:00.512962 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:02.520845 kubelet[2817]: E0709 14:59:02.520103 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:04.508419 kubelet[2817]: E0709 14:59:04.508336 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:11.627255 kubelet[2817]: E0709 14:59:06.507974 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:11.627255 kubelet[2817]: E0709 14:59:07.531261 2817 event.go:359] "Server rejected event (will not retry!)" err="etcdserver: request timed out" event="&Event{ObjectMeta:{csi-node-driver-sbhtt.18509d3254b00395 calico-system 836 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:csi-node-driver-sbhtt,UID:2052df22-65ee-4914-9e6a-b1a620327c58,APIVersion:v1,ResourceVersion:719,FieldPath:,},Reason:NetworkNotReady,Message:network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized,Source:EventSource{Component:kubelet,Host:ci-9999-9-100-bf645a1a30.novalocal,},FirstTimestamp:2025-07-09 14:58:14 +0000 UTC,LastTimestamp:2025-07-09 14:59:00.512916476 +0000 UTC m=+70.196217247,Count:24,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-9-100-bf645a1a30.novalocal,}" Jul 9 14:59:11.627255 kubelet[2817]: E0709 14:59:08.509253 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:11.627255 kubelet[2817]: E0709 14:59:10.513115 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:12.510113 kubelet[2817]: E0709 14:59:12.509866 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:14.509520 kubelet[2817]: E0709 14:59:14.508630 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:16.509825 kubelet[2817]: E0709 14:59:16.509628 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:17.220926 containerd[1558]: time="2025-07-09T14:59:17.220343989Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:59:17.238411 containerd[1558]: time="2025-07-09T14:59:17.238329314Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Jul 9 14:59:17.259661 containerd[1558]: time="2025-07-09T14:59:17.259573090Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:59:17.276651 containerd[1558]: time="2025-07-09T14:59:17.276591380Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 14:59:17.279471 containerd[1558]: time="2025-07-09T14:59:17.279389112Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 22.29079928s" Jul 9 14:59:17.280579 containerd[1558]: time="2025-07-09T14:59:17.280518945Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Jul 9 14:59:17.312627 containerd[1558]: time="2025-07-09T14:59:17.312441593Z" level=info msg="CreateContainer within sandbox \"09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jul 9 14:59:17.402709 containerd[1558]: time="2025-07-09T14:59:17.402661832Z" level=info msg="Container ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:59:17.546188 containerd[1558]: time="2025-07-09T14:59:17.544427695Z" level=info msg="CreateContainer within sandbox \"09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15\"" Jul 9 14:59:17.550319 containerd[1558]: time="2025-07-09T14:59:17.550244057Z" level=info msg="StartContainer for \"ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15\"" Jul 9 14:59:17.558026 containerd[1558]: time="2025-07-09T14:59:17.557976374Z" level=info msg="connecting to shim ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15" address="unix:///run/containerd/s/c728b76101f95f4b4b808e74a0222358b809720afc928cd20dc9dfddd726edef" protocol=ttrpc version=3 Jul 9 14:59:17.655950 systemd[1]: Started cri-containerd-ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15.scope - libcontainer container ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15. 
Jul 9 14:59:17.799732 containerd[1558]: time="2025-07-09T14:59:17.799485186Z" level=info msg="StartContainer for \"ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15\" returns successfully" Jul 9 14:59:18.508820 kubelet[2817]: E0709 14:59:18.508517 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:20.508053 kubelet[2817]: E0709 14:59:20.507912 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:22.509932 kubelet[2817]: E0709 14:59:22.509711 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:24.509518 kubelet[2817]: E0709 14:59:24.508271 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:42.884504 kubelet[2817]: I0709 14:59:25.297717 2817 status_manager.go:875] "Failed to update status for pod" pod="calico-system/calico-node-4mm6r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d23b64-d4b5-412f-b96c-ed79214183e6\\\"},\\\"status\\\":{\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"containerd://af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2\\\",\\\"image\\\":\\\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\\\",\\\"imageID\\\":\\\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"flexvol-driver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"containerd://af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-07-09T14:58:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-07-09T14:58:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/driver\\\",\\\"name\\\":\\\"flexvol-driver-host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlcdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"containerd://ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15\\\",\\\"image\\\":\\\"ghcr.io/flatcar/calico/cni:v3.30.2\\\",\\\"imageID\\\":\\\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"install-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-07-09T14:59:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cni-bin-dir\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"cni-net-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tlcdt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"calico-system\"/\"calico-node-4mm6r\": etcdserver: request timed out" Jul 9 14:59:42.884504 kubelet[2817]: E0709 14:59:26.508844 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:42.884504 kubelet[2817]: E0709 14:59:28.508349 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:42.884504 kubelet[2817]: E0709 14:59:30.509133 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:42.884504 kubelet[2817]: E0709 14:59:31.757931 2817 controller.go:195] "Failed to update lease" err="Put \"https://172.24.4.222:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-9-100-bf645a1a30.novalocal?timeout=10s\": context deadline exceeded" Jul 9 14:59:42.884504 kubelet[2817]: E0709 
14:59:32.509983 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:42.912122 containerd[1558]: time="2025-07-09T14:59:28.713687748Z" level=info msg="received exit event container_id:\"59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b\" id:\"59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b\" pid:3723 exit_status:1 exited_at:{seconds:1752073168 nanos:710558657}" Jul 9 14:59:42.912122 containerd[1558]: time="2025-07-09T14:59:28.714077142Z" level=info msg="TaskExit event in podsandbox handler container_id:\"59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b\" id:\"59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b\" pid:3723 exit_status:1 exited_at:{seconds:1752073168 nanos:710558657}" Jul 9 14:59:42.912122 containerd[1558]: time="2025-07-09T14:59:29.029154406Z" level=info msg="received exit event container_id:\"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\" id:\"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\" pid:3537 exit_status:1 exited_at:{seconds:1752073169 nanos:27500807}" Jul 9 14:59:42.912122 containerd[1558]: time="2025-07-09T14:59:29.033098162Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\" id:\"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\" pid:3537 exit_status:1 exited_at:{seconds:1752073169 nanos:27500807}" Jul 9 14:59:42.912122 containerd[1558]: time="2025-07-09T14:59:33.651969538Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a\" id:\"d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a\" pid:3559 exit_status:1 exited_at:{seconds:1752073173 nanos:647055825}" Jul 9 14:59:42.912122 containerd[1558]: time="2025-07-09T14:59:33.652350897Z" level=info msg="received exit event container_id:\"d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a\" id:\"d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a\" pid:3559 exit_status:1 exited_at:{seconds:1752073173 nanos:647055825}" Jul 9 14:59:28.703140 systemd[1]: cri-containerd-59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b.scope: Deactivated successfully. 
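[Editor's note] The "received exit event" / "TaskExit event ... exit_status:1" records above mark kube-controller-manager and two other control-plane containers dying during the etcd stall; the later "failed to handle container TaskExit event ... failed to delete task: context deadline exceeded" errors are containerd timing out while collecting those exited tasks. On the client side the equivalent flow is waiting on the task and then deleting it to reap the exit status. The sketch below uses the public containerd Go client as an illustrative analogue, not the CRI plugin's actual handler; socket, namespace, and the container ID are taken from defaults and from the log.

```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// ID copied from the log; loading any existing container works the same way.
	container, err := client.LoadContainer(ctx, "59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b")
	if err != nil {
		log.Fatal(err)
	}
	task, err := container.Task(ctx, nil)
	if err != nil {
		log.Fatal(err)
	}

	// Wait delivers the same exit status / exited_at pair that the TaskExit
	// events in the log carry.
	exitCh, err := task.Wait(ctx)
	if err != nil {
		log.Fatal(err)
	}
	status := <-exitCh
	code, exitedAt, err := status.Result()
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("exit_status:%d exited_at:%s", code, exitedAt)

	// Deleting the task reaps it; with a stalled backend this is the call that
	// can hit "context deadline exceeded", so bound it explicitly.
	dctx, cancel := context.WithTimeout(ctx, 30*time.Second)
	defer cancel()
	if _, err := task.Delete(dctx); err != nil {
		log.Printf("delete task: %v", err)
	}
}
```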
Jul 9 14:59:42.926066 kubelet[2817]: E0709 14:59:32.736740 2817 kubelet_node_status.go:535] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-07-09T14:59:22Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-07-09T14:59:22Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-07-09T14:59:22Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-07-09T14:59:22Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\\\",\\\"ghcr.io/flatcar/calico/cni:v3.30.2\\\"],\\\"sizeBytes\\\":71928924},{\\\"names\\\":[\\\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\\\",\\\"registry.k8s.io/etcd:3.5.15-0\\\"],\\\"sizeBytes\\\":56909194},{\\\"names\\\":[\\\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\\\",\\\"ghcr.io/flatcar/calico/typha:v3.30.2\\\"],\\\"sizeBytes\\\":35233218},{\\\"names\\\":[\\\"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\\\",\\\"registry.k8s.io/kube-proxy:v1.31.10\\\"],\\\"sizeBytes\\\":30382962},{\\\"names\\\":[\\\"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\\\",\\\"registry.k8s.io/kube-apiserver:v1.31.10\\\"],\\\"sizeBytes\\\":28074544},{\\\"names\\\":[\\\"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\\\",\\\"registry.k8s.io/kube-controller-manager:v1.31.10\\\"],\\\"sizeBytes\\\":26315128},{\\\"names\\\":[\\\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\\\",\\\"quay.io/tigera/operator:v1.38.3\\\"],\\\"sizeBytes\\\":25052538},{\\\"names\\\":[\\\"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\\\",\\\"registry.k8s.io/kube-scheduler:v1.31.10\\\"],\\\"sizeBytes\\\":20385523},{\\\"names\\\":[\\\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\\\",\\\"registry.k8s.io/coredns/coredns:v1.11.3\\\"],\\\"sizeBytes\\\":18562039},{\\\"names\\\":[\\\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\\\",\\\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\\\"],\\\"sizeBytes\\\":5939619},{\\\"names\\\":[\\\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\\\",\\\"registry.k8s.io/pause:3.10\\\"],\\\"sizeBytes\\\":320368}]}}\" for node \"ci-9999-9-100-bf645a1a30.novalocal\": etcdserver: request timed out" Jul 9 14:59:42.926066 kubelet[2817]: E0709 14:59:32.745717 2817 event.go:359] "Server rejected event (will not retry!)" err="etcdserver: request timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal.18509d341b6f51b2 kube-system 869 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-9999-9-100-bf645a1a30.novalocal,UID:422dcf2db8ddfe2950af52e03de96b6e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-9999-9-100-bf645a1a30.novalocal,},FirstTimestamp:2025-07-09 14:58:22 +0000 UTC,LastTimestamp:2025-07-09 14:59:24.174155481 +0000 UTC m=+93.857456232,Count:12,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-9-100-bf645a1a30.novalocal,}" Jul 9 14:59:42.926066 kubelet[2817]: E0709 14:59:34.508040 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:42.926066 kubelet[2817]: E0709 14:59:36.509685 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:42.926066 kubelet[2817]: E0709 14:59:38.509409 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:42.926066 kubelet[2817]: E0709 14:59:39.735599 2817 controller.go:195] "Failed to update lease" err="etcdserver: request timed out" Jul 9 14:59:42.926066 kubelet[2817]: E0709 14:59:40.508569 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:42.926066 kubelet[2817]: I0709 14:59:42.267475 2817 status_manager.go:851] "Failed to get status for pod" podUID="36d23b64-d4b5-412f-b96c-ed79214183e6" pod="calico-system/calico-node-4mm6r" err="etcdserver: request timed out" Jul 9 14:59:42.926066 kubelet[2817]: E0709 14:59:42.508285 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:42.932423 containerd[1558]: time="2025-07-09T14:59:42.907972444Z" level=error msg="failed to handle container TaskExit event container_id:\"59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b\" id:\"59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b\" pid:3723 exit_status:1 exited_at:{seconds:1752073168 nanos:710558657}" error="failed to stop container: failed to delete task: context deadline exceeded" Jul 9 14:59:28.705651 systemd[1]: cri-containerd-59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b.scope: Consumed 
2.302s CPU time, 44.3M memory peak. Jul 9 14:59:42.934423 kubelet[2817]: E0709 14:59:42.738226 2817 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-9999-9-100-bf645a1a30.novalocal\": Get \"https://172.24.4.222:6443/api/v1/nodes/ci-9999-9-100-bf645a1a30.novalocal?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jul 9 14:59:29.026065 systemd[1]: cri-containerd-4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a.scope: Deactivated successfully. Jul 9 14:59:29.029030 systemd[1]: cri-containerd-4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a.scope: Consumed 1.320s CPU time, 52.9M memory peak. Jul 9 14:59:33.637602 systemd[1]: cri-containerd-d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a.scope: Deactivated successfully. Jul 9 14:59:33.642139 systemd[1]: cri-containerd-d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a.scope: Consumed 2.079s CPU time, 17.8M memory peak. Jul 9 14:59:42.939953 containerd[1558]: time="2025-07-09T14:59:42.939826110Z" level=error msg="failed to handle container TaskExit event container_id:\"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\" id:\"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\" pid:3537 exit_status:1 exited_at:{seconds:1752073169 nanos:27500807}" error="failed to stop container: failed to delete task: context deadline exceeded" Jul 9 14:59:42.971906 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a-rootfs.mount: Deactivated successfully. Jul 9 14:59:42.980507 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b-rootfs.mount: Deactivated successfully. Jul 9 14:59:42.988210 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a-rootfs.mount: Deactivated successfully. 
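The node-status failure just above ("net/http: request canceled (Client.Timeout exceeded while awaiting headers)") is the standard Go http.Client request timeout surfacing through kubelet's 10s apiserver deadline. A minimal sketch that reproduces the same error class, using a deliberately slow local test server (an assumption of this example, not the real apiserver endpoint):

package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"time"
)

func main() {
	// Stand-in server that answers more slowly than the client is willing to wait.
	slow := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(2 * time.Second)
	}))
	defer slow.Close()

	// Client with an overall request deadline, like the ?timeout=10s node-status call
	// in the log (shortened here so the example finishes quickly).
	client := &http.Client{Timeout: 500 * time.Millisecond}

	_, err := client.Get(slow.URL)
	fmt.Println(err) // error text mentions "(Client.Timeout exceeded while awaiting headers)"
}

The exact wording varies with the Go version; newer releases report it as "context deadline exceeded (Client.Timeout exceeded while awaiting headers)" rather than "net/http: request canceled (...)".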
Jul 9 14:59:43.369538 kubelet[2817]: E0709 14:59:43.369482 2817 controller.go:195] "Failed to update lease" err="Operation cannot be fulfilled on leases.coordination.k8s.io \"ci-9999-9-100-bf645a1a30.novalocal\": the object has been modified; please apply your changes to the latest version and try again" Jul 9 14:59:43.654031 containerd[1558]: time="2025-07-09T14:59:43.653547603Z" level=error msg="failed to handle container TaskExit event container_id:\"d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a\" id:\"d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a\" pid:3559 exit_status:1 exited_at:{seconds:1752073173 nanos:647055825}" error="failed to stop container: failed to delete task: context deadline exceeded" Jul 9 14:59:44.436436 containerd[1558]: time="2025-07-09T14:59:44.436332535Z" level=info msg="TaskExit event container_id:\"59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b\" id:\"59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b\" pid:3723 exit_status:1 exited_at:{seconds:1752073168 nanos:710558657}" Jul 9 14:59:44.508021 kubelet[2817]: E0709 14:59:44.507903 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:45.534853 containerd[1558]: time="2025-07-09T14:59:45.534691314Z" level=error msg="ttrpc: received message on inactive stream" stream=39 Jul 9 14:59:45.601518 containerd[1558]: time="2025-07-09T14:59:45.601164108Z" level=error msg="ttrpc: received message on inactive stream" stream=37 Jul 9 14:59:45.603016 containerd[1558]: time="2025-07-09T14:59:45.602897814Z" level=error msg="ttrpc: received message on inactive stream" stream=39 Jul 9 14:59:45.945900 containerd[1558]: time="2025-07-09T14:59:45.945697182Z" level=info msg="TaskExit event container_id:\"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\" id:\"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\" pid:3537 exit_status:1 exited_at:{seconds:1752073169 nanos:27500807}" Jul 9 14:59:46.362378 containerd[1558]: time="2025-07-09T14:59:46.362156700Z" level=info msg="TaskExit event container_id:\"d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a\" id:\"d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a\" pid:3559 exit_status:1 exited_at:{seconds:1752073173 nanos:647055825}" Jul 9 14:59:46.385253 containerd[1558]: time="2025-07-09T14:59:46.384724982Z" level=info msg="Ensure that container d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a in task-service has been cleanup successfully" Jul 9 14:59:46.507990 kubelet[2817]: E0709 14:59:46.507888 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:46.675035 kubelet[2817]: I0709 14:59:46.673010 2817 scope.go:117] "RemoveContainer" containerID="66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d" Jul 9 14:59:46.675035 kubelet[2817]: I0709 14:59:46.674082 2817 scope.go:117] "RemoveContainer" containerID="d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a" Jul 9 
14:59:46.685249 containerd[1558]: time="2025-07-09T14:59:46.685164263Z" level=info msg="CreateContainer within sandbox \"96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:2,}" Jul 9 14:59:46.702063 kubelet[2817]: I0709 14:59:46.701655 2817 scope.go:117] "RemoveContainer" containerID="59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b" Jul 9 14:59:46.704172 containerd[1558]: time="2025-07-09T14:59:46.702926173Z" level=info msg="RemoveContainer for \"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\"" Jul 9 14:59:46.711855 containerd[1558]: time="2025-07-09T14:59:46.711783898Z" level=info msg="CreateContainer within sandbox \"e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:2,}" Jul 9 14:59:46.718045 kubelet[2817]: I0709 14:59:46.718014 2817 scope.go:117] "RemoveContainer" containerID="4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a" Jul 9 14:59:46.724374 containerd[1558]: time="2025-07-09T14:59:46.724165101Z" level=info msg="CreateContainer within sandbox \"34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89\" for container &ContainerMetadata{Name:tigera-operator,Attempt:2,}" Jul 9 14:59:47.096676 containerd[1558]: time="2025-07-09T14:59:47.095670254Z" level=info msg="RemoveContainer for \"66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d\" returns successfully" Jul 9 14:59:47.098006 kubelet[2817]: I0709 14:59:47.097758 2817 scope.go:117] "RemoveContainer" containerID="ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1" Jul 9 14:59:47.107518 containerd[1558]: time="2025-07-09T14:59:47.106923110Z" level=info msg="RemoveContainer for \"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\"" Jul 9 14:59:47.231852 containerd[1558]: time="2025-07-09T14:59:47.231570544Z" level=info msg="Container 57c9b947afe276b5bdb772cec5637965e0580007a40f77c802ebb7f5ca810456: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:59:47.232782 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2142542711.mount: Deactivated successfully. Jul 9 14:59:47.237856 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2025768862.mount: Deactivated successfully. Jul 9 14:59:47.562203 systemd[1]: cri-containerd-ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15.scope: Deactivated successfully. Jul 9 14:59:47.566080 systemd[1]: cri-containerd-ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15.scope: Consumed 3.022s CPU time, 190.8M memory peak, 171.2M written to disk. 
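The "Consumed N s CPU time, M memory peak" figures systemd prints when each cri-containerd-*.scope is deactivated come from cgroup resource accounting. A minimal sketch, assuming a unified cgroup v2 hierarchy and a hypothetical scope name, that reads the underlying counters directly:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Hypothetical scope directory; substitute a real cri-containerd-<id>.scope path.
	scope := "/sys/fs/cgroup/system.slice/cri-containerd-example.scope"

	// cpu.stat exposes usage_usec, the total CPU time in microseconds.
	cpu, err := os.ReadFile(filepath.Join(scope, "cpu.stat"))
	if err != nil {
		fmt.Println("cpu.stat:", err)
	} else {
		fmt.Printf("cpu.stat:\n%s", cpu)
	}

	// memory.peak (cgroup v2, kernel 5.19+) is the high-water mark reported as "memory peak".
	peak, err := os.ReadFile(filepath.Join(scope, "memory.peak"))
	if err != nil {
		fmt.Println("memory.peak:", err)
	} else {
		fmt.Printf("memory.peak: %s", peak)
	}
}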
Jul 9 14:59:47.576011 containerd[1558]: time="2025-07-09T14:59:47.575936301Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15\" id:\"ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15\" pid:3809 exited_at:{seconds:1752073187 nanos:574279802}" Jul 9 14:59:47.576801 containerd[1558]: time="2025-07-09T14:59:47.576743803Z" level=info msg="received exit event container_id:\"ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15\" id:\"ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15\" pid:3809 exited_at:{seconds:1752073187 nanos:574279802}" Jul 9 14:59:47.594493 kubelet[2817]: I0709 14:59:47.592835 2817 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Jul 9 14:59:47.639553 containerd[1558]: time="2025-07-09T14:59:47.639111813Z" level=info msg="RemoveContainer for \"ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1\" returns successfully" Jul 9 14:59:47.640010 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15-rootfs.mount: Deactivated successfully. Jul 9 14:59:47.642190 kubelet[2817]: I0709 14:59:47.642115 2817 scope.go:117] "RemoveContainer" containerID="e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e" Jul 9 14:59:47.679533 containerd[1558]: time="2025-07-09T14:59:47.679265970Z" level=info msg="RemoveContainer for \"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\"" Jul 9 14:59:47.758248 containerd[1558]: time="2025-07-09T14:59:47.758141344Z" level=info msg="Container 9a40619251fcea779c4b1f650e40a59b01e19b1c1fd6cfceb7e88824d502a39d: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:59:47.795316 containerd[1558]: time="2025-07-09T14:59:47.795233393Z" level=info msg="Container 0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4: CDI devices from CRI Config.CDIDevices: []" Jul 9 14:59:47.985693 containerd[1558]: time="2025-07-09T14:59:47.985402706Z" level=info msg="CreateContainer within sandbox \"96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39\" for &ContainerMetadata{Name:kube-scheduler,Attempt:2,} returns container id \"57c9b947afe276b5bdb772cec5637965e0580007a40f77c802ebb7f5ca810456\"" Jul 9 14:59:47.991509 containerd[1558]: time="2025-07-09T14:59:47.991372201Z" level=info msg="StartContainer for \"57c9b947afe276b5bdb772cec5637965e0580007a40f77c802ebb7f5ca810456\"" Jul 9 14:59:48.015043 containerd[1558]: time="2025-07-09T14:59:48.014923302Z" level=info msg="connecting to shim 57c9b947afe276b5bdb772cec5637965e0580007a40f77c802ebb7f5ca810456" address="unix:///run/containerd/s/373b54ba6f381b4533cf616f65bd7adc560faa2f543308a3604164117a1a93d9" protocol=ttrpc version=3 Jul 9 14:59:48.100848 systemd[1]: Started cri-containerd-57c9b947afe276b5bdb772cec5637965e0580007a40f77c802ebb7f5ca810456.scope - libcontainer container 57c9b947afe276b5bdb772cec5637965e0580007a40f77c802ebb7f5ca810456. Jul 9 14:59:48.229090 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3162368140.mount: Deactivated successfully. 
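The Unhealthy events recorded earlier for kube-apiserver ("Readiness probe failed: HTTP probe failed with statuscode: 500") reflect the usual HTTP readiness-probe rule: status codes from 200 up to, but not including, 400 count as success, anything else as failure. A minimal sketch of that classification against a stand-in endpoint that is still returning 500 (the handler and URL are assumptions of the example, not the real /readyz endpoint):

package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
)

// probeOnce performs one HTTP GET and applies the readiness rule:
// 200 <= code < 400 is healthy, everything else is unhealthy.
func probeOnce(url string) (healthy bool, code int, err error) {
	resp, err := http.Get(url)
	if err != nil {
		return false, 0, err
	}
	defer resp.Body.Close()
	return resp.StatusCode >= 200 && resp.StatusCode < 400, resp.StatusCode, nil
}

func main() {
	// Stand-in endpoint that is still failing, like the apiserver in the events above.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusInternalServerError)
	}))
	defer srv.Close()

	healthy, code, err := probeOnce(srv.URL)
	fmt.Println(healthy, code, err) // false 500 <nil>
}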
Jul 9 14:59:48.239482 containerd[1558]: time="2025-07-09T14:59:48.238483344Z" level=info msg="CreateContainer within sandbox \"e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:2,} returns container id \"9a40619251fcea779c4b1f650e40a59b01e19b1c1fd6cfceb7e88824d502a39d\"" Jul 9 14:59:48.242012 containerd[1558]: time="2025-07-09T14:59:48.241888068Z" level=info msg="StartContainer for \"9a40619251fcea779c4b1f650e40a59b01e19b1c1fd6cfceb7e88824d502a39d\"" Jul 9 14:59:48.263721 containerd[1558]: time="2025-07-09T14:59:48.263667340Z" level=info msg="connecting to shim 9a40619251fcea779c4b1f650e40a59b01e19b1c1fd6cfceb7e88824d502a39d" address="unix:///run/containerd/s/6dcc86629619ae9c68f0b9972b137ba0897f120b3d1d0192184fc1f2c28250cd" protocol=ttrpc version=3 Jul 9 14:59:48.304129 systemd[1]: Started cri-containerd-9a40619251fcea779c4b1f650e40a59b01e19b1c1fd6cfceb7e88824d502a39d.scope - libcontainer container 9a40619251fcea779c4b1f650e40a59b01e19b1c1fd6cfceb7e88824d502a39d. Jul 9 14:59:48.373388 containerd[1558]: time="2025-07-09T14:59:48.373131113Z" level=info msg="RemoveContainer for \"e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e\" returns successfully" Jul 9 14:59:48.378897 containerd[1558]: time="2025-07-09T14:59:48.378713258Z" level=info msg="StartContainer for \"57c9b947afe276b5bdb772cec5637965e0580007a40f77c802ebb7f5ca810456\" returns successfully" Jul 9 14:59:48.379685 containerd[1558]: time="2025-07-09T14:59:48.379577907Z" level=info msg="CreateContainer within sandbox \"34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89\" for &ContainerMetadata{Name:tigera-operator,Attempt:2,} returns container id \"0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4\"" Jul 9 14:59:48.380260 containerd[1558]: time="2025-07-09T14:59:48.380237930Z" level=info msg="StartContainer for \"0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4\"" Jul 9 14:59:48.383937 containerd[1558]: time="2025-07-09T14:59:48.383880110Z" level=info msg="connecting to shim 0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4" address="unix:///run/containerd/s/2cbdfe341178e08aaffb918245755042099acbdbbc5d0bb9c5629dad14c7d015" protocol=ttrpc version=3 Jul 9 14:59:48.444671 systemd[1]: Started cri-containerd-0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4.scope - libcontainer container 0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4. Jul 9 14:59:48.529504 systemd[1]: Created slice kubepods-besteffort-pod2052df22_65ee_4914_9e6a_b1a620327c58.slice - libcontainer container kubepods-besteffort-pod2052df22_65ee_4914_9e6a_b1a620327c58.slice. 
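The "connecting to shim ... protocol=ttrpc version=3" lines show containerd reaching each task shim over a unix domain socket under /run/containerd/s/ before issuing ttrpc calls to start the container. A minimal sketch of that kind of dial, with a hypothetical socket path and the ttrpc handshake itself left out:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Hypothetical shim socket path; real ones live under /run/containerd/s/<hash>.
	const sock = "/run/containerd/s/example"

	conn, err := net.DialTimeout("unix", sock, 2*time.Second)
	if err != nil {
		fmt.Println("shim not reachable:", err)
		return
	}
	defer conn.Close()
	fmt.Println("connected to", conn.RemoteAddr())
}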
Jul 9 14:59:48.538111 containerd[1558]: time="2025-07-09T14:59:48.538058707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sbhtt,Uid:2052df22-65ee-4914-9e6a-b1a620327c58,Namespace:calico-system,Attempt:0,}" Jul 9 14:59:48.565357 containerd[1558]: time="2025-07-09T14:59:48.563047235Z" level=info msg="StartContainer for \"9a40619251fcea779c4b1f650e40a59b01e19b1c1fd6cfceb7e88824d502a39d\" returns successfully" Jul 9 14:59:48.648138 containerd[1558]: time="2025-07-09T14:59:48.648095569Z" level=info msg="StartContainer for \"0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4\" returns successfully" Jul 9 14:59:48.762707 containerd[1558]: time="2025-07-09T14:59:48.762661102Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 9 14:59:48.839983 containerd[1558]: time="2025-07-09T14:59:48.839821960Z" level=error msg="Failed to destroy network for sandbox \"18eca232f004af8ec196a7bf506e4d2cbddf1dcc14bf2fb73ed17205c6d1aa0f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 14:59:48.845595 systemd[1]: run-netns-cni\x2d767e35c2\x2d072e\x2d2ce0\x2dc56b\x2d82cdb0e146e0.mount: Deactivated successfully. Jul 9 14:59:48.867861 containerd[1558]: time="2025-07-09T14:59:48.867783448Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sbhtt,Uid:2052df22-65ee-4914-9e6a-b1a620327c58,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"18eca232f004af8ec196a7bf506e4d2cbddf1dcc14bf2fb73ed17205c6d1aa0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 14:59:48.868759 kubelet[2817]: E0709 14:59:48.868670 2817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18eca232f004af8ec196a7bf506e4d2cbddf1dcc14bf2fb73ed17205c6d1aa0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 14:59:48.869197 kubelet[2817]: E0709 14:59:48.868844 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18eca232f004af8ec196a7bf506e4d2cbddf1dcc14bf2fb73ed17205c6d1aa0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-sbhtt" Jul 9 14:59:48.869197 kubelet[2817]: E0709 14:59:48.868896 2817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18eca232f004af8ec196a7bf506e4d2cbddf1dcc14bf2fb73ed17205c6d1aa0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-sbhtt" Jul 9 14:59:48.869197 kubelet[2817]: E0709 14:59:48.868985 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-sbhtt_calico-system(2052df22-65ee-4914-9e6a-b1a620327c58)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-sbhtt_calico-system(2052df22-65ee-4914-9e6a-b1a620327c58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"18eca232f004af8ec196a7bf506e4d2cbddf1dcc14bf2fb73ed17205c6d1aa0f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 14:59:59.510481 containerd[1558]: time="2025-07-09T14:59:59.509784549Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sbhtt,Uid:2052df22-65ee-4914-9e6a-b1a620327c58,Namespace:calico-system,Attempt:0,}" Jul 9 15:00:03.348522 containerd[1558]: time="2025-07-09T15:00:03.344168532Z" level=error msg="Failed to destroy network for sandbox \"5b307366355c5805ef438c033fe9ad6988811495e73f346d33056f4cc66899e2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:03.349201 systemd[1]: run-netns-cni\x2db7c5704b\x2d3f07\x2dcc8d\x2db436\x2d88f0189106e8.mount: Deactivated successfully. Jul 9 15:00:03.356941 containerd[1558]: time="2025-07-09T15:00:03.356644169Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sbhtt,Uid:2052df22-65ee-4914-9e6a-b1a620327c58,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b307366355c5805ef438c033fe9ad6988811495e73f346d33056f4cc66899e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:03.357352 kubelet[2817]: E0709 15:00:03.357233 2817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b307366355c5805ef438c033fe9ad6988811495e73f346d33056f4cc66899e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:03.358729 kubelet[2817]: E0709 15:00:03.357478 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b307366355c5805ef438c033fe9ad6988811495e73f346d33056f4cc66899e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-sbhtt" Jul 9 15:00:03.358729 kubelet[2817]: E0709 15:00:03.357540 2817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b307366355c5805ef438c033fe9ad6988811495e73f346d33056f4cc66899e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-sbhtt" Jul 9 15:00:03.358729 kubelet[2817]: E0709 15:00:03.357676 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-sbhtt_calico-system(2052df22-65ee-4914-9e6a-b1a620327c58)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-sbhtt_calico-system(2052df22-65ee-4914-9e6a-b1a620327c58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5b307366355c5805ef438c033fe9ad6988811495e73f346d33056f4cc66899e2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 15:00:16.496923 kubelet[2817]: E0709 15:00:15.156357 2817 controller.go:195] "Failed to update lease" err="etcdserver: request timed out" Jul 9 15:00:16.545750 systemd[1]: cri-containerd-0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4.scope: Deactivated successfully. Jul 9 15:00:22.513619 containerd[1558]: time="2025-07-09T15:00:16.573719622Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sbhtt,Uid:2052df22-65ee-4914-9e6a-b1a620327c58,Namespace:calico-system,Attempt:0,}" Jul 9 15:00:22.513619 containerd[1558]: time="2025-07-09T15:00:16.599434819Z" level=info msg="received exit event container_id:\"0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4\" id:\"0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4\" pid:3965 exit_status:1 exited_at:{seconds:1752073216 nanos:594357043}" Jul 9 15:00:22.513619 containerd[1558]: time="2025-07-09T15:00:16.601347408Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4\" id:\"0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4\" pid:3965 exit_status:1 exited_at:{seconds:1752073216 nanos:594357043}" Jul 9 15:00:22.516623 kubelet[2817]: E0709 15:00:16.560446 2817 kubelet.go:2512] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.053s" Jul 9 15:00:22.516623 kubelet[2817]: E0709 15:00:22.164226 2817 controller.go:195] "Failed to update lease" err="etcdserver: request timed out" Jul 9 15:00:16.661623 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4-rootfs.mount: Deactivated successfully. Jul 9 15:00:23.021160 kubelet[2817]: E0709 15:00:23.020736 2817 controller.go:195] "Failed to update lease" err="Operation cannot be fulfilled on leases.coordination.k8s.io \"ci-9999-9-100-bf645a1a30.novalocal\": the object has been modified; please apply your changes to the latest version and try again" Jul 9 15:00:24.282409 containerd[1558]: time="2025-07-09T15:00:24.280056594Z" level=error msg="Failed to destroy network for sandbox \"933cad3b47eed42fbafd9fe9b408aef2ee5ad739baf7bd806336483dcd2bd6cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:24.282785 systemd[1]: run-netns-cni\x2df2f7f922\x2d93c5\x2da07c\x2d5107\x2de428e57c9912.mount: Deactivated successfully. 
Jul 9 15:00:24.290336 containerd[1558]: time="2025-07-09T15:00:24.290284661Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sbhtt,Uid:2052df22-65ee-4914-9e6a-b1a620327c58,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"933cad3b47eed42fbafd9fe9b408aef2ee5ad739baf7bd806336483dcd2bd6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:24.292595 kubelet[2817]: E0709 15:00:24.292059 2817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"933cad3b47eed42fbafd9fe9b408aef2ee5ad739baf7bd806336483dcd2bd6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:24.292595 kubelet[2817]: E0709 15:00:24.292238 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"933cad3b47eed42fbafd9fe9b408aef2ee5ad739baf7bd806336483dcd2bd6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-sbhtt" Jul 9 15:00:24.292595 kubelet[2817]: E0709 15:00:24.292299 2817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"933cad3b47eed42fbafd9fe9b408aef2ee5ad739baf7bd806336483dcd2bd6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-sbhtt" Jul 9 15:00:24.292595 kubelet[2817]: E0709 15:00:24.292430 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-sbhtt_calico-system(2052df22-65ee-4914-9e6a-b1a620327c58)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-sbhtt_calico-system(2052df22-65ee-4914-9e6a-b1a620327c58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"933cad3b47eed42fbafd9fe9b408aef2ee5ad739baf7bd806336483dcd2bd6cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-sbhtt" podUID="2052df22-65ee-4914-9e6a-b1a620327c58" Jul 9 15:00:24.615460 kubelet[2817]: I0709 15:00:24.615302 2817 scope.go:117] "RemoveContainer" containerID="4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a" Jul 9 15:00:24.617875 kubelet[2817]: I0709 15:00:24.617764 2817 scope.go:117] "RemoveContainer" containerID="0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4" Jul 9 15:00:24.618709 kubelet[2817]: E0709 15:00:24.618413 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=tigera-operator pod=tigera-operator-5bf8dfcb4-lhj85_tigera-operator(85d1f0d4-d203-460d-b655-273a9e721bdc)\"" pod="tigera-operator/tigera-operator-5bf8dfcb4-lhj85" 
podUID="85d1f0d4-d203-460d-b655-273a9e721bdc" Jul 9 15:00:24.625811 containerd[1558]: time="2025-07-09T15:00:24.624973141Z" level=info msg="RemoveContainer for \"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\"" Jul 9 15:00:24.654426 containerd[1558]: time="2025-07-09T15:00:24.654375880Z" level=info msg="RemoveContainer for \"4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a\" returns successfully" Jul 9 15:00:26.075052 systemd[1]: Created slice kubepods-burstable-podeb7c4fb4_78f9_47f9_87f5_24145e136152.slice - libcontainer container kubepods-burstable-podeb7c4fb4_78f9_47f9_87f5_24145e136152.slice. Jul 9 15:00:26.107944 systemd[1]: Created slice kubepods-burstable-podbcddc1ac_72ed_4d50_bae7_0d28ffbb33e9.slice - libcontainer container kubepods-burstable-podbcddc1ac_72ed_4d50_bae7_0d28ffbb33e9.slice. Jul 9 15:00:26.125515 systemd[1]: Created slice kubepods-besteffort-podee12b76c_10d0_472a_83d7_d8ece5511583.slice - libcontainer container kubepods-besteffort-podee12b76c_10d0_472a_83d7_d8ece5511583.slice. Jul 9 15:00:26.138885 systemd[1]: Created slice kubepods-besteffort-podcf44cb30_e140_42e1_a090_193656cf2eba.slice - libcontainer container kubepods-besteffort-podcf44cb30_e140_42e1_a090_193656cf2eba.slice. Jul 9 15:00:26.156513 systemd[1]: Created slice kubepods-besteffort-podc0f2baca_1ed7_48f2_9669_edb27549a99b.slice - libcontainer container kubepods-besteffort-podc0f2baca_1ed7_48f2_9669_edb27549a99b.slice. Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.171538 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/359bd74f-671b-4729-9999-52147917a66d-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-jhvtt\" (UID: \"359bd74f-671b-4729-9999-52147917a66d\") " pod="calico-system/goldmane-58fd7646b9-jhvtt" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.171758 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28zvg\" (UniqueName: \"kubernetes.io/projected/cf44cb30-e140-42e1-a090-193656cf2eba-kube-api-access-28zvg\") pod \"calico-apiserver-5c659cb4b9-2bwfx\" (UID: \"cf44cb30-e140-42e1-a090-193656cf2eba\") " pod="calico-apiserver/calico-apiserver-5c659cb4b9-2bwfx" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.171939 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxrkk\" (UniqueName: \"kubernetes.io/projected/bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9-kube-api-access-xxrkk\") pod \"coredns-7c65d6cfc9-7fst7\" (UID: \"bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9\") " pod="kube-system/coredns-7c65d6cfc9-7fst7" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.171967 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrzrq\" (UniqueName: \"kubernetes.io/projected/359bd74f-671b-4729-9999-52147917a66d-kube-api-access-mrzrq\") pod \"goldmane-58fd7646b9-jhvtt\" (UID: \"359bd74f-671b-4729-9999-52147917a66d\") " pod="calico-system/goldmane-58fd7646b9-jhvtt" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.172305 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9-config-volume\") pod \"coredns-7c65d6cfc9-7fst7\" (UID: \"bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9\") " pod="kube-system/coredns-7c65d6cfc9-7fst7" Jul 9 15:00:26.177370 
kubelet[2817]: I0709 15:00:26.172330 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hqg2\" (UniqueName: \"kubernetes.io/projected/ee12b76c-10d0-472a-83d7-d8ece5511583-kube-api-access-4hqg2\") pod \"calico-kube-controllers-84d945fb8c-d8nqq\" (UID: \"ee12b76c-10d0-472a-83d7-d8ece5511583\") " pod="calico-system/calico-kube-controllers-84d945fb8c-d8nqq" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.172601 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/cf44cb30-e140-42e1-a090-193656cf2eba-calico-apiserver-certs\") pod \"calico-apiserver-5c659cb4b9-2bwfx\" (UID: \"cf44cb30-e140-42e1-a090-193656cf2eba\") " pod="calico-apiserver/calico-apiserver-5c659cb4b9-2bwfx" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.173541 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/359bd74f-671b-4729-9999-52147917a66d-config\") pod \"goldmane-58fd7646b9-jhvtt\" (UID: \"359bd74f-671b-4729-9999-52147917a66d\") " pod="calico-system/goldmane-58fd7646b9-jhvtt" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.173618 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb7c4fb4-78f9-47f9-87f5-24145e136152-config-volume\") pod \"coredns-7c65d6cfc9-j7g9m\" (UID: \"eb7c4fb4-78f9-47f9-87f5-24145e136152\") " pod="kube-system/coredns-7c65d6cfc9-j7g9m" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.173710 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ee12b76c-10d0-472a-83d7-d8ece5511583-tigera-ca-bundle\") pod \"calico-kube-controllers-84d945fb8c-d8nqq\" (UID: \"ee12b76c-10d0-472a-83d7-d8ece5511583\") " pod="calico-system/calico-kube-controllers-84d945fb8c-d8nqq" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.173779 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c0f2baca-1ed7-48f2-9669-edb27549a99b-calico-apiserver-certs\") pod \"calico-apiserver-5c659cb4b9-8q7q2\" (UID: \"c0f2baca-1ed7-48f2-9669-edb27549a99b\") " pod="calico-apiserver/calico-apiserver-5c659cb4b9-8q7q2" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.173826 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j45vw\" (UniqueName: \"kubernetes.io/projected/27a7d09a-2393-460a-acd1-909a60e162e7-kube-api-access-j45vw\") pod \"whisker-776d787996-n2k5h\" (UID: \"27a7d09a-2393-460a-acd1-909a60e162e7\") " pod="calico-system/whisker-776d787996-n2k5h" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.173885 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jfn\" (UniqueName: \"kubernetes.io/projected/c0f2baca-1ed7-48f2-9669-edb27549a99b-kube-api-access-q6jfn\") pod \"calico-apiserver-5c659cb4b9-8q7q2\" (UID: \"c0f2baca-1ed7-48f2-9669-edb27549a99b\") " pod="calico-apiserver/calico-apiserver-5c659cb4b9-8q7q2" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.173954 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" 
(UniqueName: \"kubernetes.io/secret/27a7d09a-2393-460a-acd1-909a60e162e7-whisker-backend-key-pair\") pod \"whisker-776d787996-n2k5h\" (UID: \"27a7d09a-2393-460a-acd1-909a60e162e7\") " pod="calico-system/whisker-776d787996-n2k5h" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.173988 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27a7d09a-2393-460a-acd1-909a60e162e7-whisker-ca-bundle\") pod \"whisker-776d787996-n2k5h\" (UID: \"27a7d09a-2393-460a-acd1-909a60e162e7\") " pod="calico-system/whisker-776d787996-n2k5h" Jul 9 15:00:26.177370 kubelet[2817]: I0709 15:00:26.174058 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f49j\" (UniqueName: \"kubernetes.io/projected/eb7c4fb4-78f9-47f9-87f5-24145e136152-kube-api-access-6f49j\") pod \"coredns-7c65d6cfc9-j7g9m\" (UID: \"eb7c4fb4-78f9-47f9-87f5-24145e136152\") " pod="kube-system/coredns-7c65d6cfc9-j7g9m" Jul 9 15:00:26.187415 kubelet[2817]: I0709 15:00:26.174089 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/359bd74f-671b-4729-9999-52147917a66d-goldmane-key-pair\") pod \"goldmane-58fd7646b9-jhvtt\" (UID: \"359bd74f-671b-4729-9999-52147917a66d\") " pod="calico-system/goldmane-58fd7646b9-jhvtt" Jul 9 15:00:26.188100 systemd[1]: Created slice kubepods-besteffort-pod27a7d09a_2393_460a_acd1_909a60e162e7.slice - libcontainer container kubepods-besteffort-pod27a7d09a_2393_460a_acd1_909a60e162e7.slice. Jul 9 15:00:26.198945 systemd[1]: Created slice kubepods-besteffort-pod359bd74f_671b_4729_9999_52147917a66d.slice - libcontainer container kubepods-besteffort-pod359bd74f_671b_4729_9999_52147917a66d.slice. 
Jul 9 15:00:26.284219 kubelet[2817]: E0709 15:00:26.284139 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-j45vw whisker-backend-key-pair whisker-ca-bundle], unattached volumes=[], failed to process volumes=[]: context canceled" pod="calico-system/whisker-776d787996-n2k5h" podUID="27a7d09a-2393-460a-acd1-909a60e162e7" Jul 9 15:00:26.473109 containerd[1558]: time="2025-07-09T15:00:26.470909322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c659cb4b9-2bwfx,Uid:cf44cb30-e140-42e1-a090-193656cf2eba,Namespace:calico-apiserver,Attempt:0,}" Jul 9 15:00:26.615151 containerd[1558]: time="2025-07-09T15:00:26.606682925Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-jhvtt,Uid:359bd74f-671b-4729-9999-52147917a66d,Namespace:calico-system,Attempt:0,}" Jul 9 15:00:26.710540 containerd[1558]: time="2025-07-09T15:00:26.710057251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-j7g9m,Uid:eb7c4fb4-78f9-47f9-87f5-24145e136152,Namespace:kube-system,Attempt:0,}" Jul 9 15:00:26.735063 containerd[1558]: time="2025-07-09T15:00:26.734865369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84d945fb8c-d8nqq,Uid:ee12b76c-10d0-472a-83d7-d8ece5511583,Namespace:calico-system,Attempt:0,}" Jul 9 15:00:26.736104 containerd[1558]: time="2025-07-09T15:00:26.735930223Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7fst7,Uid:bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9,Namespace:kube-system,Attempt:0,}" Jul 9 15:00:26.778138 containerd[1558]: time="2025-07-09T15:00:26.778057541Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c659cb4b9-8q7q2,Uid:c0f2baca-1ed7-48f2-9669-edb27549a99b,Namespace:calico-apiserver,Attempt:0,}" Jul 9 15:00:26.856650 kubelet[2817]: I0709 15:00:26.856592 2817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j45vw\" (UniqueName: \"kubernetes.io/projected/27a7d09a-2393-460a-acd1-909a60e162e7-kube-api-access-j45vw\") pod \"27a7d09a-2393-460a-acd1-909a60e162e7\" (UID: \"27a7d09a-2393-460a-acd1-909a60e162e7\") " Jul 9 15:00:26.857265 kubelet[2817]: I0709 15:00:26.857028 2817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27a7d09a-2393-460a-acd1-909a60e162e7-whisker-ca-bundle\") pod \"27a7d09a-2393-460a-acd1-909a60e162e7\" (UID: \"27a7d09a-2393-460a-acd1-909a60e162e7\") " Jul 9 15:00:26.857265 kubelet[2817]: I0709 15:00:26.857113 2817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/27a7d09a-2393-460a-acd1-909a60e162e7-whisker-backend-key-pair\") pod \"27a7d09a-2393-460a-acd1-909a60e162e7\" (UID: \"27a7d09a-2393-460a-acd1-909a60e162e7\") " Jul 9 15:00:26.866138 kubelet[2817]: I0709 15:00:26.865293 2817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27a7d09a-2393-460a-acd1-909a60e162e7-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "27a7d09a-2393-460a-acd1-909a60e162e7" (UID: "27a7d09a-2393-460a-acd1-909a60e162e7"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jul 9 15:00:26.890160 kubelet[2817]: I0709 15:00:26.889750 2817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27a7d09a-2393-460a-acd1-909a60e162e7-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "27a7d09a-2393-460a-acd1-909a60e162e7" (UID: "27a7d09a-2393-460a-acd1-909a60e162e7"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 9 15:00:26.894275 kubelet[2817]: I0709 15:00:26.894202 2817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a7d09a-2393-460a-acd1-909a60e162e7-kube-api-access-j45vw" (OuterVolumeSpecName: "kube-api-access-j45vw") pod "27a7d09a-2393-460a-acd1-909a60e162e7" (UID: "27a7d09a-2393-460a-acd1-909a60e162e7"). InnerVolumeSpecName "kube-api-access-j45vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 9 15:00:26.958220 kubelet[2817]: I0709 15:00:26.958083 2817 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/27a7d09a-2393-460a-acd1-909a60e162e7-whisker-backend-key-pair\") on node \"ci-9999-9-100-bf645a1a30.novalocal\" DevicePath \"\"" Jul 9 15:00:26.958220 kubelet[2817]: I0709 15:00:26.958154 2817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j45vw\" (UniqueName: \"kubernetes.io/projected/27a7d09a-2393-460a-acd1-909a60e162e7-kube-api-access-j45vw\") on node \"ci-9999-9-100-bf645a1a30.novalocal\" DevicePath \"\"" Jul 9 15:00:26.958220 kubelet[2817]: I0709 15:00:26.958171 2817 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27a7d09a-2393-460a-acd1-909a60e162e7-whisker-ca-bundle\") on node \"ci-9999-9-100-bf645a1a30.novalocal\" DevicePath \"\"" Jul 9 15:00:27.023181 containerd[1558]: time="2025-07-09T15:00:27.022864662Z" level=error msg="Failed to destroy network for sandbox \"4604f6dfb8a4e56793eaa1895509e9c7abf3bee06c82f28afed98fc77463f6bb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.026105 containerd[1558]: time="2025-07-09T15:00:27.025956383Z" level=error msg="Failed to destroy network for sandbox \"72c1d9bbb6060668a362ad16bb0a1e4c8bf30080cd463a699996e50cba4458f6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.130764 containerd[1558]: time="2025-07-09T15:00:27.130212431Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-jhvtt,Uid:359bd74f-671b-4729-9999-52147917a66d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4604f6dfb8a4e56793eaa1895509e9c7abf3bee06c82f28afed98fc77463f6bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.132964 kubelet[2817]: E0709 15:00:27.132756 2817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4604f6dfb8a4e56793eaa1895509e9c7abf3bee06c82f28afed98fc77463f6bb\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.133440 kubelet[2817]: E0709 15:00:27.133294 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4604f6dfb8a4e56793eaa1895509e9c7abf3bee06c82f28afed98fc77463f6bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-jhvtt" Jul 9 15:00:27.133440 kubelet[2817]: E0709 15:00:27.133400 2817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4604f6dfb8a4e56793eaa1895509e9c7abf3bee06c82f28afed98fc77463f6bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-jhvtt" Jul 9 15:00:27.134419 kubelet[2817]: E0709 15:00:27.133921 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-jhvtt_calico-system(359bd74f-671b-4729-9999-52147917a66d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-jhvtt_calico-system(359bd74f-671b-4729-9999-52147917a66d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4604f6dfb8a4e56793eaa1895509e9c7abf3bee06c82f28afed98fc77463f6bb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-jhvtt" podUID="359bd74f-671b-4729-9999-52147917a66d" Jul 9 15:00:27.140877 containerd[1558]: time="2025-07-09T15:00:27.140785976Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c659cb4b9-2bwfx,Uid:cf44cb30-e140-42e1-a090-193656cf2eba,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"72c1d9bbb6060668a362ad16bb0a1e4c8bf30080cd463a699996e50cba4458f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.143810 kubelet[2817]: E0709 15:00:27.143500 2817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72c1d9bbb6060668a362ad16bb0a1e4c8bf30080cd463a699996e50cba4458f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.143810 kubelet[2817]: E0709 15:00:27.143624 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72c1d9bbb6060668a362ad16bb0a1e4c8bf30080cd463a699996e50cba4458f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c659cb4b9-2bwfx" Jul 9 15:00:27.143810 kubelet[2817]: E0709 15:00:27.143660 2817 
kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72c1d9bbb6060668a362ad16bb0a1e4c8bf30080cd463a699996e50cba4458f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c659cb4b9-2bwfx" Jul 9 15:00:27.143810 kubelet[2817]: E0709 15:00:27.143731 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5c659cb4b9-2bwfx_calico-apiserver(cf44cb30-e140-42e1-a090-193656cf2eba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5c659cb4b9-2bwfx_calico-apiserver(cf44cb30-e140-42e1-a090-193656cf2eba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"72c1d9bbb6060668a362ad16bb0a1e4c8bf30080cd463a699996e50cba4458f6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5c659cb4b9-2bwfx" podUID="cf44cb30-e140-42e1-a090-193656cf2eba" Jul 9 15:00:27.197079 containerd[1558]: time="2025-07-09T15:00:27.196741714Z" level=error msg="Failed to destroy network for sandbox \"e0365f3169bba9a7b98e15f7eacbcddd80e2487ccbf960cce0f4f98078511c45\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.209039 containerd[1558]: time="2025-07-09T15:00:27.207187929Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-j7g9m,Uid:eb7c4fb4-78f9-47f9-87f5-24145e136152,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0365f3169bba9a7b98e15f7eacbcddd80e2487ccbf960cce0f4f98078511c45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.209303 kubelet[2817]: E0709 15:00:27.207581 2817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0365f3169bba9a7b98e15f7eacbcddd80e2487ccbf960cce0f4f98078511c45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.209303 kubelet[2817]: E0709 15:00:27.207686 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0365f3169bba9a7b98e15f7eacbcddd80e2487ccbf960cce0f4f98078511c45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-j7g9m" Jul 9 15:00:27.209303 kubelet[2817]: E0709 15:00:27.207715 2817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0365f3169bba9a7b98e15f7eacbcddd80e2487ccbf960cce0f4f98078511c45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-j7g9m" Jul 9 15:00:27.209303 kubelet[2817]: E0709 15:00:27.207771 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-j7g9m_kube-system(eb7c4fb4-78f9-47f9-87f5-24145e136152)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-j7g9m_kube-system(eb7c4fb4-78f9-47f9-87f5-24145e136152)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e0365f3169bba9a7b98e15f7eacbcddd80e2487ccbf960cce0f4f98078511c45\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-j7g9m" podUID="eb7c4fb4-78f9-47f9-87f5-24145e136152" Jul 9 15:00:27.217865 containerd[1558]: time="2025-07-09T15:00:27.217667807Z" level=error msg="Failed to destroy network for sandbox \"636ce9c4917790cd1107512ff69ac33c34f2980e2a4ca7f81146eebc670627e3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.222441 containerd[1558]: time="2025-07-09T15:00:27.222051516Z" level=error msg="Failed to destroy network for sandbox \"383a4a8cbfcc02f6627eabdd91c274a589f6699f5b5a329d640a4f9c4f1a0800\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.229818 containerd[1558]: time="2025-07-09T15:00:27.229750799Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c659cb4b9-8q7q2,Uid:c0f2baca-1ed7-48f2-9669-edb27549a99b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"636ce9c4917790cd1107512ff69ac33c34f2980e2a4ca7f81146eebc670627e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.230648 kubelet[2817]: E0709 15:00:27.230550 2817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"636ce9c4917790cd1107512ff69ac33c34f2980e2a4ca7f81146eebc670627e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.231270 kubelet[2817]: E0709 15:00:27.231210 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"636ce9c4917790cd1107512ff69ac33c34f2980e2a4ca7f81146eebc670627e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c659cb4b9-8q7q2" Jul 9 15:00:27.231409 kubelet[2817]: E0709 15:00:27.231380 2817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"636ce9c4917790cd1107512ff69ac33c34f2980e2a4ca7f81146eebc670627e3\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c659cb4b9-8q7q2" Jul 9 15:00:27.232308 kubelet[2817]: E0709 15:00:27.231619 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5c659cb4b9-8q7q2_calico-apiserver(c0f2baca-1ed7-48f2-9669-edb27549a99b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5c659cb4b9-8q7q2_calico-apiserver(c0f2baca-1ed7-48f2-9669-edb27549a99b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"636ce9c4917790cd1107512ff69ac33c34f2980e2a4ca7f81146eebc670627e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5c659cb4b9-8q7q2" podUID="c0f2baca-1ed7-48f2-9669-edb27549a99b" Jul 9 15:00:27.237648 containerd[1558]: time="2025-07-09T15:00:27.237566663Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7fst7,Uid:bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"383a4a8cbfcc02f6627eabdd91c274a589f6699f5b5a329d640a4f9c4f1a0800\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.238606 containerd[1558]: time="2025-07-09T15:00:27.238292972Z" level=error msg="Failed to destroy network for sandbox \"83bae3dbd4d865da158a997e14d7f22bccc3decd5d444b41008a0a142d228e77\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.239301 kubelet[2817]: E0709 15:00:27.238396 2817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"383a4a8cbfcc02f6627eabdd91c274a589f6699f5b5a329d640a4f9c4f1a0800\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.239301 kubelet[2817]: E0709 15:00:27.239178 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"383a4a8cbfcc02f6627eabdd91c274a589f6699f5b5a329d640a4f9c4f1a0800\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-7fst7" Jul 9 15:00:27.239301 kubelet[2817]: E0709 15:00:27.239213 2817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"383a4a8cbfcc02f6627eabdd91c274a589f6699f5b5a329d640a4f9c4f1a0800\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-7fst7" Jul 9 15:00:27.240229 kubelet[2817]: E0709 15:00:27.240080 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-7c65d6cfc9-7fst7_kube-system(bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-7fst7_kube-system(bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"383a4a8cbfcc02f6627eabdd91c274a589f6699f5b5a329d640a4f9c4f1a0800\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-7fst7" podUID="bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9" Jul 9 15:00:27.243548 containerd[1558]: time="2025-07-09T15:00:27.243480538Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84d945fb8c-d8nqq,Uid:ee12b76c-10d0-472a-83d7-d8ece5511583,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"83bae3dbd4d865da158a997e14d7f22bccc3decd5d444b41008a0a142d228e77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.244646 kubelet[2817]: E0709 15:00:27.243990 2817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83bae3dbd4d865da158a997e14d7f22bccc3decd5d444b41008a0a142d228e77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:27.244646 kubelet[2817]: E0709 15:00:27.244061 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83bae3dbd4d865da158a997e14d7f22bccc3decd5d444b41008a0a142d228e77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-84d945fb8c-d8nqq" Jul 9 15:00:27.244646 kubelet[2817]: E0709 15:00:27.244086 2817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83bae3dbd4d865da158a997e14d7f22bccc3decd5d444b41008a0a142d228e77\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-84d945fb8c-d8nqq" Jul 9 15:00:27.244646 kubelet[2817]: E0709 15:00:27.244154 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-84d945fb8c-d8nqq_calico-system(ee12b76c-10d0-472a-83d7-d8ece5511583)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-84d945fb8c-d8nqq_calico-system(ee12b76c-10d0-472a-83d7-d8ece5511583)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"83bae3dbd4d865da158a997e14d7f22bccc3decd5d444b41008a0a142d228e77\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-84d945fb8c-d8nqq" 
podUID="ee12b76c-10d0-472a-83d7-d8ece5511583" Jul 9 15:00:27.341313 systemd[1]: var-lib-kubelet-pods-27a7d09a\x2d2393\x2d460a\x2dacd1\x2d909a60e162e7-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jul 9 15:00:27.674759 systemd[1]: Removed slice kubepods-besteffort-pod27a7d09a_2393_460a_acd1_909a60e162e7.slice - libcontainer container kubepods-besteffort-pod27a7d09a_2393_460a_acd1_909a60e162e7.slice. Jul 9 15:00:28.512131 kubelet[2817]: I0709 15:00:28.512081 2817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27a7d09a-2393-460a-acd1-909a60e162e7" path="/var/lib/kubelet/pods/27a7d09a-2393-460a-acd1-909a60e162e7/volumes" Jul 9 15:00:28.534498 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3096223186.mount: Deactivated successfully. Jul 9 15:00:36.977396 containerd[1558]: time="2025-07-09T15:00:36.976945046Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:00:36.994496 containerd[1558]: time="2025-07-09T15:00:36.993418750Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Jul 9 15:00:37.021574 containerd[1558]: time="2025-07-09T15:00:37.021478409Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:00:37.048826 containerd[1558]: time="2025-07-09T15:00:37.048707753Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:00:37.050269 containerd[1558]: time="2025-07-09T15:00:37.049692180Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 48.286963419s" Jul 9 15:00:37.050269 containerd[1558]: time="2025-07-09T15:00:37.049770757Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Jul 9 15:00:37.130912 containerd[1558]: time="2025-07-09T15:00:37.130802842Z" level=info msg="CreateContainer within sandbox \"09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 9 15:00:37.157727 systemd[1]: Created slice kubepods-besteffort-pod33b985a0_8206_4e6c_9e8e_54c920a706c5.slice - libcontainer container kubepods-besteffort-pod33b985a0_8206_4e6c_9e8e_54c920a706c5.slice. 
Jul 9 15:00:37.249930 containerd[1558]: time="2025-07-09T15:00:37.248184528Z" level=info msg="Container 00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:00:37.265239 kubelet[2817]: I0709 15:00:37.265047 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b985a0-8206-4e6c-9e8e-54c920a706c5-whisker-ca-bundle\") pod \"whisker-5b97bcbb6-ssbg8\" (UID: \"33b985a0-8206-4e6c-9e8e-54c920a706c5\") " pod="calico-system/whisker-5b97bcbb6-ssbg8" Jul 9 15:00:37.265239 kubelet[2817]: I0709 15:00:37.265191 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/33b985a0-8206-4e6c-9e8e-54c920a706c5-whisker-backend-key-pair\") pod \"whisker-5b97bcbb6-ssbg8\" (UID: \"33b985a0-8206-4e6c-9e8e-54c920a706c5\") " pod="calico-system/whisker-5b97bcbb6-ssbg8" Jul 9 15:00:37.265239 kubelet[2817]: I0709 15:00:37.265252 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj7fb\" (UniqueName: \"kubernetes.io/projected/33b985a0-8206-4e6c-9e8e-54c920a706c5-kube-api-access-nj7fb\") pod \"whisker-5b97bcbb6-ssbg8\" (UID: \"33b985a0-8206-4e6c-9e8e-54c920a706c5\") " pod="calico-system/whisker-5b97bcbb6-ssbg8" Jul 9 15:00:37.419028 containerd[1558]: time="2025-07-09T15:00:37.417816286Z" level=info msg="CreateContainer within sandbox \"09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\"" Jul 9 15:00:37.419469 containerd[1558]: time="2025-07-09T15:00:37.419426954Z" level=info msg="StartContainer for \"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\"" Jul 9 15:00:37.422715 containerd[1558]: time="2025-07-09T15:00:37.422678215Z" level=info msg="connecting to shim 00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd" address="unix:///run/containerd/s/c728b76101f95f4b4b808e74a0222358b809720afc928cd20dc9dfddd726edef" protocol=ttrpc version=3 Jul 9 15:00:37.464873 systemd[1]: Started cri-containerd-00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd.scope - libcontainer container 00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd. 
Jul 9 15:00:37.469491 containerd[1558]: time="2025-07-09T15:00:37.469398150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b97bcbb6-ssbg8,Uid:33b985a0-8206-4e6c-9e8e-54c920a706c5,Namespace:calico-system,Attempt:0,}" Jul 9 15:00:37.510542 kubelet[2817]: I0709 15:00:37.509366 2817 scope.go:117] "RemoveContainer" containerID="0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4" Jul 9 15:00:37.510718 containerd[1558]: time="2025-07-09T15:00:37.509850617Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sbhtt,Uid:2052df22-65ee-4914-9e6a-b1a620327c58,Namespace:calico-system,Attempt:0,}" Jul 9 15:00:38.504614 containerd[1558]: time="2025-07-09T15:00:38.504072195Z" level=info msg="CreateContainer within sandbox \"34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89\" for container &ContainerMetadata{Name:tigera-operator,Attempt:3,}" Jul 9 15:00:38.519511 containerd[1558]: time="2025-07-09T15:00:38.519367425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c659cb4b9-2bwfx,Uid:cf44cb30-e140-42e1-a090-193656cf2eba,Namespace:calico-apiserver,Attempt:0,}" Jul 9 15:00:38.521507 containerd[1558]: time="2025-07-09T15:00:38.521402292Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7fst7,Uid:bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9,Namespace:kube-system,Attempt:0,}" Jul 9 15:00:38.522908 containerd[1558]: time="2025-07-09T15:00:38.522029154Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-j7g9m,Uid:eb7c4fb4-78f9-47f9-87f5-24145e136152,Namespace:kube-system,Attempt:0,}" Jul 9 15:00:38.522908 containerd[1558]: time="2025-07-09T15:00:38.522525821Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-jhvtt,Uid:359bd74f-671b-4729-9999-52147917a66d,Namespace:calico-system,Attempt:0,}" Jul 9 15:00:38.608874 containerd[1558]: time="2025-07-09T15:00:38.608286997Z" level=info msg="StartContainer for \"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" returns successfully" Jul 9 15:00:39.388076 containerd[1558]: time="2025-07-09T15:00:39.387990888Z" level=error msg="Failed to destroy network for sandbox \"e41314d6c48f11d1d8de15682a0412d54c689054b1366fe16ee9128156a4b5bf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:43.360588 containerd[1558]: time="2025-07-09T15:00:41.512221359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84d945fb8c-d8nqq,Uid:ee12b76c-10d0-472a-83d7-d8ece5511583,Namespace:calico-system,Attempt:0,}" Jul 9 15:00:43.360588 containerd[1558]: time="2025-07-09T15:00:42.511573357Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c659cb4b9-8q7q2,Uid:c0f2baca-1ed7-48f2-9669-edb27549a99b,Namespace:calico-apiserver,Attempt:0,}" Jul 9 15:00:39.390858 systemd[1]: run-netns-cni\x2d39791964\x2d041b\x2dc813\x2d524a\x2d989291fb9048.mount: Deactivated successfully. Jul 9 15:00:43.447872 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 9 15:00:43.448085 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jul 9 15:00:43.484939 containerd[1558]: time="2025-07-09T15:00:43.484889461Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" id:\"8e06062be2aa57a8b80f50d42ea0e792582f503f54ba9ae3fc299f0834fbb7dc\" pid:4348 exit_status:1 exited_at:{seconds:1752073243 nanos:483930984}" Jul 9 15:00:43.727827 containerd[1558]: time="2025-07-09T15:00:43.727700990Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" id:\"c5f6ac43dc26786eb3ec0e70b860acf5248b1e0be7fadb0b3c294ca1ab5aaf38\" pid:4373 exit_status:1 exited_at:{seconds:1752073243 nanos:726687960}" Jul 9 15:00:45.111499 containerd[1558]: time="2025-07-09T15:00:45.110413977Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b97bcbb6-ssbg8,Uid:33b985a0-8206-4e6c-9e8e-54c920a706c5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e41314d6c48f11d1d8de15682a0412d54c689054b1366fe16ee9128156a4b5bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:45.117526 kubelet[2817]: E0709 15:00:45.115863 2817 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e41314d6c48f11d1d8de15682a0412d54c689054b1366fe16ee9128156a4b5bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 9 15:00:45.117526 kubelet[2817]: E0709 15:00:45.116889 2817 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e41314d6c48f11d1d8de15682a0412d54c689054b1366fe16ee9128156a4b5bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5b97bcbb6-ssbg8" Jul 9 15:00:45.117526 kubelet[2817]: E0709 15:00:45.117069 2817 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e41314d6c48f11d1d8de15682a0412d54c689054b1366fe16ee9128156a4b5bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5b97bcbb6-ssbg8" Jul 9 15:00:45.121661 kubelet[2817]: E0709 15:00:45.120056 2817 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5b97bcbb6-ssbg8_calico-system(33b985a0-8206-4e6c-9e8e-54c920a706c5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5b97bcbb6-ssbg8_calico-system(33b985a0-8206-4e6c-9e8e-54c920a706c5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e41314d6c48f11d1d8de15682a0412d54c689054b1366fe16ee9128156a4b5bf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5b97bcbb6-ssbg8" podUID="33b985a0-8206-4e6c-9e8e-54c920a706c5" Jul 9 15:00:45.778507 containerd[1558]: 
time="2025-07-09T15:00:45.777140682Z" level=info msg="Container d33f42d58a284d873e3990622964297a4c281fb6ee9fe86c30ac3dc37b535b09: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:00:45.812181 containerd[1558]: time="2025-07-09T15:00:45.812103832Z" level=info msg="CreateContainer within sandbox \"34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89\" for &ContainerMetadata{Name:tigera-operator,Attempt:3,} returns container id \"d33f42d58a284d873e3990622964297a4c281fb6ee9fe86c30ac3dc37b535b09\"" Jul 9 15:00:45.816144 containerd[1558]: time="2025-07-09T15:00:45.816036906Z" level=info msg="StartContainer for \"d33f42d58a284d873e3990622964297a4c281fb6ee9fe86c30ac3dc37b535b09\"" Jul 9 15:00:45.826485 containerd[1558]: time="2025-07-09T15:00:45.825640039Z" level=info msg="connecting to shim d33f42d58a284d873e3990622964297a4c281fb6ee9fe86c30ac3dc37b535b09" address="unix:///run/containerd/s/2cbdfe341178e08aaffb918245755042099acbdbbc5d0bb9c5629dad14c7d015" protocol=ttrpc version=3 Jul 9 15:00:46.220488 systemd[1]: Started cri-containerd-d33f42d58a284d873e3990622964297a4c281fb6ee9fe86c30ac3dc37b535b09.scope - libcontainer container d33f42d58a284d873e3990622964297a4c281fb6ee9fe86c30ac3dc37b535b09. Jul 9 15:00:46.537786 containerd[1558]: time="2025-07-09T15:00:46.537609423Z" level=info msg="StartContainer for \"d33f42d58a284d873e3990622964297a4c281fb6ee9fe86c30ac3dc37b535b09\" returns successfully" Jul 9 15:00:46.717658 kubelet[2817]: I0709 15:00:46.717113 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-4mm6r" podStartSLOduration=10.475277562 podStartE2EDuration="2m32.716955443s" podCreationTimestamp="2025-07-09 14:58:14 +0000 UTC" firstStartedPulling="2025-07-09 14:58:14.809361398 +0000 UTC m=+24.492662109" lastFinishedPulling="2025-07-09 15:00:37.051039279 +0000 UTC m=+166.734339990" observedRunningTime="2025-07-09 15:00:43.450093422 +0000 UTC m=+173.133394163" watchObservedRunningTime="2025-07-09 15:00:46.716955443 +0000 UTC m=+176.400256154" Jul 9 15:00:46.761700 systemd-networkd[1444]: cali113ebcb3a43: Link UP Jul 9 15:00:46.762006 systemd-networkd[1444]: cali113ebcb3a43: Gained carrier Jul 9 15:00:46.822228 containerd[1558]: 2025-07-09 15:00:45.614 [INFO][4419] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 9 15:00:46.822228 containerd[1558]: 2025-07-09 15:00:45.842 [INFO][4419] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0 calico-apiserver-5c659cb4b9- calico-apiserver cf44cb30-e140-42e1-a090-193656cf2eba 1004 0 2025-07-09 14:58:08 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c659cb4b9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-9-100-bf645a1a30.novalocal calico-apiserver-5c659cb4b9-2bwfx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali113ebcb3a43 [] [] }} ContainerID="91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-2bwfx" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-" Jul 9 15:00:46.822228 containerd[1558]: 2025-07-09 15:00:45.842 [INFO][4419] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-2bwfx" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0" Jul 9 15:00:46.822228 containerd[1558]: 2025-07-09 15:00:46.477 [INFO][4528] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" HandleID="k8s-pod-network.91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0" Jul 9 15:00:46.823136 containerd[1558]: 2025-07-09 15:00:46.484 [INFO][4528] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" HandleID="k8s-pod-network.91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f990), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-9-100-bf645a1a30.novalocal", "pod":"calico-apiserver-5c659cb4b9-2bwfx", "timestamp":"2025-07-09 15:00:46.477107352 +0000 UTC"}, Hostname:"ci-9999-9-100-bf645a1a30.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 15:00:46.823136 containerd[1558]: 2025-07-09 15:00:46.485 [INFO][4528] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:00:46.823136 containerd[1558]: 2025-07-09 15:00:46.485 [INFO][4528] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 9 15:00:46.823136 containerd[1558]: 2025-07-09 15:00:46.485 [INFO][4528] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-bf645a1a30.novalocal' Jul 9 15:00:46.823136 containerd[1558]: 2025-07-09 15:00:46.519 [INFO][4528] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.823136 containerd[1558]: 2025-07-09 15:00:46.562 [INFO][4528] ipam/ipam.go 394: Looking up existing affinities for host host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.823136 containerd[1558]: 2025-07-09 15:00:46.601 [INFO][4528] ipam/ipam.go 511: Trying affinity for 192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.823136 containerd[1558]: 2025-07-09 15:00:46.613 [INFO][4528] ipam/ipam.go 158: Attempting to load block cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.823136 containerd[1558]: 2025-07-09 15:00:46.619 [INFO][4528] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.825530 containerd[1558]: 2025-07-09 15:00:46.619 [INFO][4528] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.111.192/26 handle="k8s-pod-network.91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.825530 containerd[1558]: 2025-07-09 15:00:46.626 [INFO][4528] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f Jul 9 15:00:46.825530 containerd[1558]: 2025-07-09 15:00:46.646 [INFO][4528] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.111.192/26 handle="k8s-pod-network.91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.825530 containerd[1558]: 2025-07-09 15:00:46.684 [INFO][4528] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.111.193/26] block=192.168.111.192/26 handle="k8s-pod-network.91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.825530 containerd[1558]: 2025-07-09 15:00:46.685 [INFO][4528] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.111.193/26] handle="k8s-pod-network.91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.825530 containerd[1558]: 2025-07-09 15:00:46.686 [INFO][4528] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 9 15:00:46.825530 containerd[1558]: 2025-07-09 15:00:46.687 [INFO][4528] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.111.193/26] IPv6=[] ContainerID="91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" HandleID="k8s-pod-network.91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0" Jul 9 15:00:46.825786 containerd[1558]: 2025-07-09 15:00:46.717 [INFO][4419] cni-plugin/k8s.go 418: Populated endpoint ContainerID="91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-2bwfx" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0", GenerateName:"calico-apiserver-5c659cb4b9-", Namespace:"calico-apiserver", SelfLink:"", UID:"cf44cb30-e140-42e1-a090-193656cf2eba", ResourceVersion:"1004", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 58, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c659cb4b9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"", Pod:"calico-apiserver-5c659cb4b9-2bwfx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.111.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali113ebcb3a43", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:46.825892 containerd[1558]: 2025-07-09 15:00:46.719 [INFO][4419] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.111.193/32] ContainerID="91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-2bwfx" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0" Jul 9 15:00:46.825892 containerd[1558]: 2025-07-09 15:00:46.719 [INFO][4419] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali113ebcb3a43 ContainerID="91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-2bwfx" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0" Jul 9 15:00:46.825892 containerd[1558]: 2025-07-09 15:00:46.765 [INFO][4419] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-2bwfx" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0" 
Jul 9 15:00:46.826021 containerd[1558]: 2025-07-09 15:00:46.766 [INFO][4419] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-2bwfx" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0", GenerateName:"calico-apiserver-5c659cb4b9-", Namespace:"calico-apiserver", SelfLink:"", UID:"cf44cb30-e140-42e1-a090-193656cf2eba", ResourceVersion:"1004", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 58, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c659cb4b9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f", Pod:"calico-apiserver-5c659cb4b9-2bwfx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.111.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali113ebcb3a43", MAC:"2a:41:bb:b6:7a:31", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:46.826115 containerd[1558]: 2025-07-09 15:00:46.812 [INFO][4419] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-2bwfx" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--2bwfx-eth0" Jul 9 15:00:46.917433 systemd-networkd[1444]: cali280040ee528: Link UP Jul 9 15:00:46.917865 systemd-networkd[1444]: cali280040ee528: Gained carrier Jul 9 15:00:46.979679 containerd[1558]: 2025-07-09 15:00:45.495 [INFO][4407] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 9 15:00:46.979679 containerd[1558]: 2025-07-09 15:00:45.823 [INFO][4407] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0 csi-node-driver- calico-system 2052df22-65ee-4914-9e6a-b1a620327c58 724 0 2025-07-09 14:58:14 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-9999-9-100-bf645a1a30.novalocal csi-node-driver-sbhtt eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali280040ee528 [] [] }} 
ContainerID="e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" Namespace="calico-system" Pod="csi-node-driver-sbhtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-" Jul 9 15:00:46.979679 containerd[1558]: 2025-07-09 15:00:45.823 [INFO][4407] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" Namespace="calico-system" Pod="csi-node-driver-sbhtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0" Jul 9 15:00:46.979679 containerd[1558]: 2025-07-09 15:00:46.523 [INFO][4524] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" HandleID="k8s-pod-network.e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0" Jul 9 15:00:46.980132 containerd[1558]: 2025-07-09 15:00:46.535 [INFO][4524] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" HandleID="k8s-pod-network.e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000391430), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-9-100-bf645a1a30.novalocal", "pod":"csi-node-driver-sbhtt", "timestamp":"2025-07-09 15:00:46.523036272 +0000 UTC"}, Hostname:"ci-9999-9-100-bf645a1a30.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 15:00:46.980132 containerd[1558]: 2025-07-09 15:00:46.535 [INFO][4524] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:00:46.980132 containerd[1558]: 2025-07-09 15:00:46.686 [INFO][4524] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 9 15:00:46.980132 containerd[1558]: 2025-07-09 15:00:46.691 [INFO][4524] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-bf645a1a30.novalocal' Jul 9 15:00:46.980132 containerd[1558]: 2025-07-09 15:00:46.723 [INFO][4524] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.980132 containerd[1558]: 2025-07-09 15:00:46.755 [INFO][4524] ipam/ipam.go 394: Looking up existing affinities for host host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.980132 containerd[1558]: 2025-07-09 15:00:46.805 [INFO][4524] ipam/ipam.go 511: Trying affinity for 192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.980132 containerd[1558]: 2025-07-09 15:00:46.818 [INFO][4524] ipam/ipam.go 158: Attempting to load block cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.980132 containerd[1558]: 2025-07-09 15:00:46.833 [INFO][4524] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.981973 containerd[1558]: 2025-07-09 15:00:46.833 [INFO][4524] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.111.192/26 handle="k8s-pod-network.e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.981973 containerd[1558]: 2025-07-09 15:00:46.846 [INFO][4524] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec Jul 9 15:00:46.981973 containerd[1558]: 2025-07-09 15:00:46.860 [INFO][4524] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.111.192/26 handle="k8s-pod-network.e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.981973 containerd[1558]: 2025-07-09 15:00:46.900 [INFO][4524] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.111.194/26] block=192.168.111.192/26 handle="k8s-pod-network.e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.981973 containerd[1558]: 2025-07-09 15:00:46.900 [INFO][4524] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.111.194/26] handle="k8s-pod-network.e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:46.981973 containerd[1558]: 2025-07-09 15:00:46.900 [INFO][4524] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 9 15:00:46.981973 containerd[1558]: 2025-07-09 15:00:46.900 [INFO][4524] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.111.194/26] IPv6=[] ContainerID="e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" HandleID="k8s-pod-network.e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0" Jul 9 15:00:46.982632 containerd[1558]: 2025-07-09 15:00:46.906 [INFO][4407] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" Namespace="calico-system" Pod="csi-node-driver-sbhtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2052df22-65ee-4914-9e6a-b1a620327c58", ResourceVersion:"724", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 58, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"", Pod:"csi-node-driver-sbhtt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.111.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali280040ee528", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:46.983424 containerd[1558]: 2025-07-09 15:00:46.906 [INFO][4407] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.111.194/32] ContainerID="e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" Namespace="calico-system" Pod="csi-node-driver-sbhtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0" Jul 9 15:00:46.983424 containerd[1558]: 2025-07-09 15:00:46.906 [INFO][4407] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali280040ee528 ContainerID="e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" Namespace="calico-system" Pod="csi-node-driver-sbhtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0" Jul 9 15:00:46.983424 containerd[1558]: 2025-07-09 15:00:46.921 [INFO][4407] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" Namespace="calico-system" Pod="csi-node-driver-sbhtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0" Jul 9 15:00:46.983570 containerd[1558]: 2025-07-09 15:00:46.923 [INFO][4407] cni-plugin/k8s.go 446: Added Mac, interface name, and 
active container ID to endpoint ContainerID="e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" Namespace="calico-system" Pod="csi-node-driver-sbhtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2052df22-65ee-4914-9e6a-b1a620327c58", ResourceVersion:"724", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 58, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec", Pod:"csi-node-driver-sbhtt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.111.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali280040ee528", MAC:"a6:f9:50:bb:74:e3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:46.983655 containerd[1558]: 2025-07-09 15:00:46.974 [INFO][4407] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" Namespace="calico-system" Pod="csi-node-driver-sbhtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-csi--node--driver--sbhtt-eth0" Jul 9 15:00:47.093928 systemd-networkd[1444]: cali068eb0fed98: Link UP Jul 9 15:00:47.098183 systemd-networkd[1444]: cali068eb0fed98: Gained carrier Jul 9 15:00:47.142835 containerd[1558]: time="2025-07-09T15:00:47.142724381Z" level=info msg="connecting to shim 91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f" address="unix:///run/containerd/s/631500851695963479ff4db819e3079cc957c88e5a7e5e5d8ba1d6b0195fe458" namespace=k8s.io protocol=ttrpc version=3 Jul 9 15:00:47.173778 containerd[1558]: 2025-07-09 15:00:46.059 [INFO][4470] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 9 15:00:47.173778 containerd[1558]: 2025-07-09 15:00:46.169 [INFO][4470] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0 calico-apiserver-5c659cb4b9- calico-apiserver c0f2baca-1ed7-48f2-9669-edb27549a99b 1005 0 2025-07-09 14:58:08 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c659cb4b9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-9-100-bf645a1a30.novalocal 
calico-apiserver-5c659cb4b9-8q7q2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali068eb0fed98 [] [] }} ContainerID="088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-8q7q2" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-" Jul 9 15:00:47.173778 containerd[1558]: 2025-07-09 15:00:46.169 [INFO][4470] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-8q7q2" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0" Jul 9 15:00:47.173778 containerd[1558]: 2025-07-09 15:00:46.582 [INFO][4596] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" HandleID="k8s-pod-network.088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0" Jul 9 15:00:47.174320 containerd[1558]: 2025-07-09 15:00:46.583 [INFO][4596] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" HandleID="k8s-pod-network.088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ae520), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-9-100-bf645a1a30.novalocal", "pod":"calico-apiserver-5c659cb4b9-8q7q2", "timestamp":"2025-07-09 15:00:46.577419538 +0000 UTC"}, Hostname:"ci-9999-9-100-bf645a1a30.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 15:00:47.174320 containerd[1558]: 2025-07-09 15:00:46.586 [INFO][4596] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:00:47.174320 containerd[1558]: 2025-07-09 15:00:46.900 [INFO][4596] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 9 15:00:47.174320 containerd[1558]: 2025-07-09 15:00:46.900 [INFO][4596] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-bf645a1a30.novalocal' Jul 9 15:00:47.174320 containerd[1558]: 2025-07-09 15:00:46.930 [INFO][4596] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.174320 containerd[1558]: 2025-07-09 15:00:46.955 [INFO][4596] ipam/ipam.go 394: Looking up existing affinities for host host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.174320 containerd[1558]: 2025-07-09 15:00:46.996 [INFO][4596] ipam/ipam.go 511: Trying affinity for 192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.174320 containerd[1558]: 2025-07-09 15:00:47.003 [INFO][4596] ipam/ipam.go 158: Attempting to load block cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.174320 containerd[1558]: 2025-07-09 15:00:47.011 [INFO][4596] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.176854 containerd[1558]: 2025-07-09 15:00:47.011 [INFO][4596] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.111.192/26 handle="k8s-pod-network.088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.176854 containerd[1558]: 2025-07-09 15:00:47.016 [INFO][4596] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df Jul 9 15:00:47.176854 containerd[1558]: 2025-07-09 15:00:47.049 [INFO][4596] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.111.192/26 handle="k8s-pod-network.088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.176854 containerd[1558]: 2025-07-09 15:00:47.069 [INFO][4596] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.111.195/26] block=192.168.111.192/26 handle="k8s-pod-network.088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.176854 containerd[1558]: 2025-07-09 15:00:47.069 [INFO][4596] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.111.195/26] handle="k8s-pod-network.088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.176854 containerd[1558]: 2025-07-09 15:00:47.069 [INFO][4596] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 9 15:00:47.176854 containerd[1558]: 2025-07-09 15:00:47.069 [INFO][4596] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.111.195/26] IPv6=[] ContainerID="088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" HandleID="k8s-pod-network.088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0" Jul 9 15:00:47.179030 containerd[1558]: 2025-07-09 15:00:47.083 [INFO][4470] cni-plugin/k8s.go 418: Populated endpoint ContainerID="088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-8q7q2" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0", GenerateName:"calico-apiserver-5c659cb4b9-", Namespace:"calico-apiserver", SelfLink:"", UID:"c0f2baca-1ed7-48f2-9669-edb27549a99b", ResourceVersion:"1005", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 58, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c659cb4b9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"", Pod:"calico-apiserver-5c659cb4b9-8q7q2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.111.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali068eb0fed98", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:47.180044 containerd[1558]: 2025-07-09 15:00:47.086 [INFO][4470] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.111.195/32] ContainerID="088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-8q7q2" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0" Jul 9 15:00:47.180044 containerd[1558]: 2025-07-09 15:00:47.086 [INFO][4470] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali068eb0fed98 ContainerID="088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-8q7q2" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0" Jul 9 15:00:47.180044 containerd[1558]: 2025-07-09 15:00:47.097 [INFO][4470] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-8q7q2" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0" 
Jul 9 15:00:47.181199 containerd[1558]: 2025-07-09 15:00:47.102 [INFO][4470] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-8q7q2" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0", GenerateName:"calico-apiserver-5c659cb4b9-", Namespace:"calico-apiserver", SelfLink:"", UID:"c0f2baca-1ed7-48f2-9669-edb27549a99b", ResourceVersion:"1005", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 58, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c659cb4b9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df", Pod:"calico-apiserver-5c659cb4b9-8q7q2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.111.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali068eb0fed98", MAC:"72:2c:68:f7:80:df", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:47.181302 containerd[1558]: 2025-07-09 15:00:47.159 [INFO][4470] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" Namespace="calico-apiserver" Pod="calico-apiserver-5c659cb4b9-8q7q2" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--apiserver--5c659cb4b9--8q7q2-eth0" Jul 9 15:00:47.247568 containerd[1558]: time="2025-07-09T15:00:47.247380930Z" level=info msg="connecting to shim e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec" address="unix:///run/containerd/s/bac8065457c681a8c08deb425609cff07be402208a83fd4c2967711fff66db57" namespace=k8s.io protocol=ttrpc version=3 Jul 9 15:00:47.284001 systemd[1]: Started cri-containerd-91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f.scope - libcontainer container 91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f. 
Jul 9 15:00:47.409690 containerd[1558]: time="2025-07-09T15:00:47.407304108Z" level=info msg="connecting to shim 088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df" address="unix:///run/containerd/s/b520090a4847e61fec2db5161aa3565207466441886f6b3c335009fd232ed731" namespace=k8s.io protocol=ttrpc version=3 Jul 9 15:00:47.424706 systemd-networkd[1444]: calieb0203028d2: Link UP Jul 9 15:00:47.426177 systemd-networkd[1444]: calieb0203028d2: Gained carrier Jul 9 15:00:47.474842 systemd[1]: Started cri-containerd-088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df.scope - libcontainer container 088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df. Jul 9 15:00:47.493593 systemd[1]: Started cri-containerd-e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec.scope - libcontainer container e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec. Jul 9 15:00:47.507378 containerd[1558]: 2025-07-09 15:00:46.207 [INFO][4431] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 9 15:00:47.507378 containerd[1558]: 2025-07-09 15:00:46.290 [INFO][4431] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0 coredns-7c65d6cfc9- kube-system bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9 986 0 2025-07-09 14:57:55 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-9-100-bf645a1a30.novalocal coredns-7c65d6cfc9-7fst7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calieb0203028d2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7fst7" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-" Jul 9 15:00:47.507378 containerd[1558]: 2025-07-09 15:00:46.290 [INFO][4431] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7fst7" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0" Jul 9 15:00:47.507378 containerd[1558]: 2025-07-09 15:00:46.647 [INFO][4621] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" HandleID="k8s-pod-network.ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0" Jul 9 15:00:47.507801 containerd[1558]: 2025-07-09 15:00:46.650 [INFO][4621] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" HandleID="k8s-pod-network.ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00037f1b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-9-100-bf645a1a30.novalocal", "pod":"coredns-7c65d6cfc9-7fst7", "timestamp":"2025-07-09 15:00:46.642443633 +0000 UTC"}, Hostname:"ci-9999-9-100-bf645a1a30.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 15:00:47.507801 containerd[1558]: 2025-07-09 15:00:46.650 [INFO][4621] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:00:47.507801 containerd[1558]: 2025-07-09 15:00:47.070 [INFO][4621] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 15:00:47.507801 containerd[1558]: 2025-07-09 15:00:47.070 [INFO][4621] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-bf645a1a30.novalocal' Jul 9 15:00:47.507801 containerd[1558]: 2025-07-09 15:00:47.134 [INFO][4621] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.507801 containerd[1558]: 2025-07-09 15:00:47.177 [INFO][4621] ipam/ipam.go 394: Looking up existing affinities for host host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.507801 containerd[1558]: 2025-07-09 15:00:47.238 [INFO][4621] ipam/ipam.go 511: Trying affinity for 192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.507801 containerd[1558]: 2025-07-09 15:00:47.257 [INFO][4621] ipam/ipam.go 158: Attempting to load block cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.507801 containerd[1558]: 2025-07-09 15:00:47.276 [INFO][4621] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.509689 containerd[1558]: 2025-07-09 15:00:47.281 [INFO][4621] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.111.192/26 handle="k8s-pod-network.ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.509689 containerd[1558]: 2025-07-09 15:00:47.299 [INFO][4621] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5 Jul 9 15:00:47.509689 containerd[1558]: 2025-07-09 15:00:47.332 [INFO][4621] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.111.192/26 handle="k8s-pod-network.ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.509689 containerd[1558]: 2025-07-09 15:00:47.388 [INFO][4621] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.111.196/26] block=192.168.111.192/26 handle="k8s-pod-network.ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.509689 containerd[1558]: 2025-07-09 15:00:47.389 [INFO][4621] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.111.196/26] handle="k8s-pod-network.ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.509689 containerd[1558]: 2025-07-09 15:00:47.389 [INFO][4621] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
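
Worth noting on the IPAM trace above: the node's affine block is 192.168.111.192/26 and the address handed to coredns-7c65d6cfc9-7fst7 is 192.168.111.196. As a quick illustration of the CIDR arithmetic only (Go standard library, not Calico's allocator), the following snippet confirms the claimed address sits inside that /26 and that such a block spans 2^6 = 64 addresses:

    package main

    import (
            "fmt"
            "net/netip"
    )

    func main() {
            // Values taken from the IPAM log entries above.
            block := netip.MustParsePrefix("192.168.111.192/26")
            claimed := netip.MustParseAddr("192.168.111.196")

            // A /26 leaves 32-26 = 6 host bits, i.e. 2^6 = 64 addresses.
            size := 1 << (32 - block.Bits())

            fmt.Println(block.Contains(claimed)) // true
            fmt.Println(size)                    // 64
    }
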
Jul 9 15:00:47.509689 containerd[1558]: 2025-07-09 15:00:47.389 [INFO][4621] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.111.196/26] IPv6=[] ContainerID="ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" HandleID="k8s-pod-network.ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0" Jul 9 15:00:47.510004 containerd[1558]: 2025-07-09 15:00:47.402 [INFO][4431] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7fst7" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9", ResourceVersion:"986", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 57, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"", Pod:"coredns-7c65d6cfc9-7fst7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.111.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calieb0203028d2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:47.510004 containerd[1558]: 2025-07-09 15:00:47.403 [INFO][4431] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.111.196/32] ContainerID="ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7fst7" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0" Jul 9 15:00:47.510004 containerd[1558]: 2025-07-09 15:00:47.403 [INFO][4431] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieb0203028d2 ContainerID="ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7fst7" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0" Jul 9 15:00:47.510004 containerd[1558]: 2025-07-09 15:00:47.427 [INFO][4431] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7fst7" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0" Jul 9 15:00:47.510004 containerd[1558]: 2025-07-09 15:00:47.428 [INFO][4431] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7fst7" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9", ResourceVersion:"986", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 57, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5", Pod:"coredns-7c65d6cfc9-7fst7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.111.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calieb0203028d2", MAC:"a2:a1:cb:8c:fd:76", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:47.510004 containerd[1558]: 2025-07-09 15:00:47.491 [INFO][4431] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7fst7" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--7fst7-eth0" Jul 9 15:00:47.626577 systemd-networkd[1444]: cali466351894ef: Link UP Jul 9 15:00:47.632090 systemd-networkd[1444]: cali466351894ef: Gained carrier Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:46.233 [INFO][4449] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:46.281 [INFO][4449] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0 coredns-7c65d6cfc9- kube-system eb7c4fb4-78f9-47f9-87f5-24145e136152 979 0 2025-07-09 14:57:55 +0000 UTC map[k8s-app:kube-dns 
pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-9-100-bf645a1a30.novalocal coredns-7c65d6cfc9-j7g9m eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali466351894ef [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" Namespace="kube-system" Pod="coredns-7c65d6cfc9-j7g9m" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:46.282 [INFO][4449] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" Namespace="kube-system" Pod="coredns-7c65d6cfc9-j7g9m" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:46.676 [INFO][4622] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" HandleID="k8s-pod-network.c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:46.684 [INFO][4622] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" HandleID="k8s-pod-network.c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003319a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-9-100-bf645a1a30.novalocal", "pod":"coredns-7c65d6cfc9-j7g9m", "timestamp":"2025-07-09 15:00:46.676284365 +0000 UTC"}, Hostname:"ci-9999-9-100-bf645a1a30.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:46.685 [INFO][4622] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.389 [INFO][4622] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
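
The WorkloadEndpoint dump for coredns-7c65d6cfc9-7fst7 above prints the container ports in hex (Port:0x35 for dns and dns-tcp, Port:0x23c1 for metrics), while the CNI summary lists the same ports by name as 53/UDP, 53/TCP and 9153/TCP. A two-line Go check (pure arithmetic, nothing Calico-specific) confirms the two encodings agree before the second coredns assignment continues below:

    package main

    import "fmt"

    func main() {
            // Hex values as printed in the WorkloadEndpointPort dumps above.
            fmt.Println("dns/dns-tcp:", 0x35, 0x35 == 53)      // 53 true
            fmt.Println("metrics:    ", 0x23c1, 0x23c1 == 9153) // 9153 true
    }
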
Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.390 [INFO][4622] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-bf645a1a30.novalocal' Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.450 [INFO][4622] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.505 [INFO][4622] ipam/ipam.go 394: Looking up existing affinities for host host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.520 [INFO][4622] ipam/ipam.go 511: Trying affinity for 192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.524 [INFO][4622] ipam/ipam.go 158: Attempting to load block cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.530 [INFO][4622] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.531 [INFO][4622] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.111.192/26 handle="k8s-pod-network.c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.546 [INFO][4622] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954 Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.580 [INFO][4622] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.111.192/26 handle="k8s-pod-network.c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.600 [INFO][4622] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.111.197/26] block=192.168.111.192/26 handle="k8s-pod-network.c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.601 [INFO][4622] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.111.197/26] handle="k8s-pod-network.c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.602 [INFO][4622] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
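
Both coredns assignments above are serialized by the same host-wide IPAM lock: the coredns-7c65d6cfc9-j7g9m request logs "About to acquire" at 15:00:46.685 but only "Acquired" at 15:00:47.389, i.e. it waited roughly 0.7 s while the 7fst7 assignment held the lock. A deliberately simplified Go model of that behaviour (a single mutex guarding a shared allocator; not Calico's ipam_plugin.go) looks like this:

    package main

    import (
            "fmt"
            "sync"
    )

    // hostIPAM is a toy stand-in for the host-wide IPAM lock seen in the log:
    // concurrent assignment requests are served one at a time.
    type hostIPAM struct {
            mu   sync.Mutex
            next int // last host octet handed out within 192.168.111.192/26
    }

    func (h *hostIPAM) assign() string {
            h.mu.Lock() // "About to acquire host-wide IPAM lock."
            defer h.mu.Unlock()
            h.next++ // "Auto-assign 1 ipv4 ..."
            return fmt.Sprintf("192.168.111.%d/26", h.next)
    }

    func main() {
            ipam := &hostIPAM{next: 195} // .196 is the next free address in the block
            var wg sync.WaitGroup
            for _, pod := range []string{"coredns-7c65d6cfc9-7fst7", "coredns-7c65d6cfc9-j7g9m"} {
                    wg.Add(1)
                    go func(pod string) {
                            defer wg.Done()
                            fmt.Println(pod, "=>", ipam.assign())
                    }(pod)
            }
            wg.Wait()
    }
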
Jul 9 15:00:47.719577 containerd[1558]: 2025-07-09 15:00:47.603 [INFO][4622] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.111.197/26] IPv6=[] ContainerID="c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" HandleID="k8s-pod-network.c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0" Jul 9 15:00:47.722869 containerd[1558]: 2025-07-09 15:00:47.611 [INFO][4449] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" Namespace="kube-system" Pod="coredns-7c65d6cfc9-j7g9m" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"eb7c4fb4-78f9-47f9-87f5-24145e136152", ResourceVersion:"979", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 57, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"", Pod:"coredns-7c65d6cfc9-j7g9m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.111.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali466351894ef", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:47.722869 containerd[1558]: 2025-07-09 15:00:47.612 [INFO][4449] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.111.197/32] ContainerID="c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" Namespace="kube-system" Pod="coredns-7c65d6cfc9-j7g9m" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0" Jul 9 15:00:47.722869 containerd[1558]: 2025-07-09 15:00:47.613 [INFO][4449] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali466351894ef ContainerID="c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" Namespace="kube-system" Pod="coredns-7c65d6cfc9-j7g9m" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0" Jul 9 15:00:47.722869 containerd[1558]: 2025-07-09 15:00:47.655 [INFO][4449] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" Namespace="kube-system" Pod="coredns-7c65d6cfc9-j7g9m" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0" Jul 9 15:00:47.722869 containerd[1558]: 2025-07-09 15:00:47.658 [INFO][4449] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" Namespace="kube-system" Pod="coredns-7c65d6cfc9-j7g9m" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"eb7c4fb4-78f9-47f9-87f5-24145e136152", ResourceVersion:"979", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 57, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954", Pod:"coredns-7c65d6cfc9-j7g9m", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.111.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali466351894ef", MAC:"52:db:5d:0f:e7:b6", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:47.722869 containerd[1558]: 2025-07-09 15:00:47.692 [INFO][4449] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" Namespace="kube-system" Pod="coredns-7c65d6cfc9-j7g9m" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-coredns--7c65d6cfc9--j7g9m-eth0" Jul 9 15:00:47.722869 containerd[1558]: time="2025-07-09T15:00:47.721150795Z" level=info msg="connecting to shim ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5" address="unix:///run/containerd/s/064c79f3055918d92ba298e9f29d8d35576165f046172b4d68234803e7f84889" namespace=k8s.io protocol=ttrpc version=3 Jul 9 15:00:47.786142 containerd[1558]: time="2025-07-09T15:00:47.786075850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sbhtt,Uid:2052df22-65ee-4914-9e6a-b1a620327c58,Namespace:calico-system,Attempt:0,} returns sandbox id \"e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec\"" Jul 9 15:00:47.795887 
containerd[1558]: time="2025-07-09T15:00:47.795534108Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 9 15:00:47.829104 containerd[1558]: time="2025-07-09T15:00:47.829061879Z" level=info msg="connecting to shim c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954" address="unix:///run/containerd/s/61f0f0b5d85b9a557cb4cc44edd58efb3401e339352d0fcf786114a81a6f8bc8" namespace=k8s.io protocol=ttrpc version=3 Jul 9 15:00:47.862867 systemd[1]: Started cri-containerd-ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5.scope - libcontainer container ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5. Jul 9 15:00:47.894847 systemd[1]: Started cri-containerd-c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954.scope - libcontainer container c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954. Jul 9 15:00:47.914838 systemd-networkd[1444]: cali819b0b6bd21: Link UP Jul 9 15:00:47.918947 systemd-networkd[1444]: cali819b0b6bd21: Gained carrier Jul 9 15:00:47.983969 containerd[1558]: time="2025-07-09T15:00:47.983908788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c659cb4b9-2bwfx,Uid:cf44cb30-e140-42e1-a090-193656cf2eba,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f\"" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:46.288 [INFO][4433] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:46.369 [INFO][4433] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0 goldmane-58fd7646b9- calico-system 359bd74f-671b-4729-9999-52147917a66d 995 0 2025-07-09 14:58:13 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-9999-9-100-bf645a1a30.novalocal goldmane-58fd7646b9-jhvtt eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali819b0b6bd21 [] [] }} ContainerID="a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" Namespace="calico-system" Pod="goldmane-58fd7646b9-jhvtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:46.370 [INFO][4433] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" Namespace="calico-system" Pod="goldmane-58fd7646b9-jhvtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:46.710 [INFO][4634] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" HandleID="k8s-pod-network.a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:46.711 [INFO][4634] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" HandleID="k8s-pod-network.a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" 
Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001032d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-9-100-bf645a1a30.novalocal", "pod":"goldmane-58fd7646b9-jhvtt", "timestamp":"2025-07-09 15:00:46.706960933 +0000 UTC"}, Hostname:"ci-9999-9-100-bf645a1a30.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:46.711 [INFO][4634] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.604 [INFO][4634] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.606 [INFO][4634] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-bf645a1a30.novalocal' Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.673 [INFO][4634] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.725 [INFO][4634] ipam/ipam.go 394: Looking up existing affinities for host host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.739 [INFO][4634] ipam/ipam.go 511: Trying affinity for 192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.744 [INFO][4634] ipam/ipam.go 158: Attempting to load block cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.755 [INFO][4634] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.762 [INFO][4634] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.111.192/26 handle="k8s-pod-network.a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.767 [INFO][4634] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9 Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.783 [INFO][4634] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.111.192/26 handle="k8s-pod-network.a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.820 [INFO][4634] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.111.198/26] block=192.168.111.192/26 handle="k8s-pod-network.a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.827 [INFO][4634] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.111.198/26] handle="k8s-pod-network.a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.830 [INFO][4634] ipam/ipam_plugin.go 374: Released host-wide IPAM 
lock. Jul 9 15:00:47.985320 containerd[1558]: 2025-07-09 15:00:47.830 [INFO][4634] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.111.198/26] IPv6=[] ContainerID="a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" HandleID="k8s-pod-network.a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0" Jul 9 15:00:47.988786 containerd[1558]: 2025-07-09 15:00:47.884 [INFO][4433] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" Namespace="calico-system" Pod="goldmane-58fd7646b9-jhvtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"359bd74f-671b-4729-9999-52147917a66d", ResourceVersion:"995", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 58, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"", Pod:"goldmane-58fd7646b9-jhvtt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.111.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali819b0b6bd21", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:47.988786 containerd[1558]: 2025-07-09 15:00:47.890 [INFO][4433] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.111.198/32] ContainerID="a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" Namespace="calico-system" Pod="goldmane-58fd7646b9-jhvtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0" Jul 9 15:00:47.988786 containerd[1558]: 2025-07-09 15:00:47.893 [INFO][4433] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali819b0b6bd21 ContainerID="a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" Namespace="calico-system" Pod="goldmane-58fd7646b9-jhvtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0" Jul 9 15:00:47.988786 containerd[1558]: 2025-07-09 15:00:47.927 [INFO][4433] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" Namespace="calico-system" Pod="goldmane-58fd7646b9-jhvtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0" Jul 9 15:00:47.988786 containerd[1558]: 2025-07-09 15:00:47.928 [INFO][4433] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" Namespace="calico-system" Pod="goldmane-58fd7646b9-jhvtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"359bd74f-671b-4729-9999-52147917a66d", ResourceVersion:"995", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 58, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9", Pod:"goldmane-58fd7646b9-jhvtt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.111.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali819b0b6bd21", MAC:"2e:47:0b:ce:94:52", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:47.988786 containerd[1558]: 2025-07-09 15:00:47.972 [INFO][4433] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" Namespace="calico-system" Pod="goldmane-58fd7646b9-jhvtt" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-goldmane--58fd7646b9--jhvtt-eth0" Jul 9 15:00:48.091995 containerd[1558]: time="2025-07-09T15:00:48.091912822Z" level=info msg="connecting to shim a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" address="unix:///run/containerd/s/8865673696ab0ed226747529de69a462592d3b5a20a3abf65dcf9da6880beab1" namespace=k8s.io protocol=ttrpc version=3 Jul 9 15:00:48.193864 containerd[1558]: time="2025-07-09T15:00:48.193802050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7fst7,Uid:bcddc1ac-72ed-4d50-bae7-0d28ffbb33e9,Namespace:kube-system,Attempt:0,} returns sandbox id \"ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5\"" Jul 9 15:00:48.203745 containerd[1558]: time="2025-07-09T15:00:48.203669280Z" level=info msg="CreateContainer within sandbox \"ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 9 15:00:48.227882 systemd-networkd[1444]: cali280040ee528: Gained IPv6LL Jul 9 15:00:48.241079 systemd[1]: Started cri-containerd-a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9.scope - libcontainer container a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9. 
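
Each "connecting to shim" entry above carries the shim's ttrpc endpoint as a unix:// address under /run/containerd/s/. A small stdlib-only Go sketch (the sample line is copied from the goldmane shim entry above) extracts that socket path from such a line, for example as a first step before inspecting the socket on the node:

    package main

    import (
            "fmt"
            "regexp"
            "strings"
    )

    func main() {
            // Sample taken from the "connecting to shim" entry for the goldmane sandbox above.
            line := `level=info msg="connecting to shim a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9" address="unix:///run/containerd/s/8865673696ab0ed226747529de69a462592d3b5a20a3abf65dcf9da6880beab1" namespace=k8s.io protocol=ttrpc version=3`

            re := regexp.MustCompile(`address="unix://([^"]+)"`)
            m := re.FindStringSubmatch(line)
            if m == nil {
                    fmt.Println("no shim address found")
                    return
            }
            sockPath := m[1]
            fmt.Println("shim socket:", sockPath)
            fmt.Println("under /run/containerd/s:", strings.HasPrefix(sockPath, "/run/containerd/s/"))
    }
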
Jul 9 15:00:48.265584 systemd-networkd[1444]: cali675d02213e9: Link UP Jul 9 15:00:48.271668 systemd-networkd[1444]: cali675d02213e9: Gained carrier Jul 9 15:00:48.283706 containerd[1558]: time="2025-07-09T15:00:48.283540814Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-j7g9m,Uid:eb7c4fb4-78f9-47f9-87f5-24145e136152,Namespace:kube-system,Attempt:0,} returns sandbox id \"c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954\"" Jul 9 15:00:48.288179 systemd-networkd[1444]: cali068eb0fed98: Gained IPv6LL Jul 9 15:00:48.309657 containerd[1558]: time="2025-07-09T15:00:48.309597620Z" level=info msg="CreateContainer within sandbox \"c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 9 15:00:48.326856 containerd[1558]: time="2025-07-09T15:00:48.326755412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c659cb4b9-8q7q2,Uid:c0f2baca-1ed7-48f2-9669-edb27549a99b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df\"" Jul 9 15:00:48.338336 containerd[1558]: time="2025-07-09T15:00:48.336669670Z" level=info msg="Container 3b30e208b1178f269e4eb373a0642f9c5a271fdb4dbd34a30ad1015ad9ae717f: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:00:48.337566 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4263558657.mount: Deactivated successfully. Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:46.274 [INFO][4455] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:46.349 [INFO][4455] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0 calico-kube-controllers-84d945fb8c- calico-system ee12b76c-10d0-472a-83d7-d8ece5511583 989 0 2025-07-09 14:58:14 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:84d945fb8c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-9999-9-100-bf645a1a30.novalocal calico-kube-controllers-84d945fb8c-d8nqq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali675d02213e9 [] [] }} ContainerID="13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" Namespace="calico-system" Pod="calico-kube-controllers-84d945fb8c-d8nqq" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:46.352 [INFO][4455] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" Namespace="calico-system" Pod="calico-kube-controllers-84d945fb8c-d8nqq" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:46.730 [INFO][4632] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" HandleID="k8s-pod-network.13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" 
Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:46.730 [INFO][4632] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" HandleID="k8s-pod-network.13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000123d50), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-9-100-bf645a1a30.novalocal", "pod":"calico-kube-controllers-84d945fb8c-d8nqq", "timestamp":"2025-07-09 15:00:46.730180042 +0000 UTC"}, Hostname:"ci-9999-9-100-bf645a1a30.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:46.731 [INFO][4632] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:47.834 [INFO][4632] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:47.870 [INFO][4632] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-bf645a1a30.novalocal' Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:47.954 [INFO][4632] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:48.022 [INFO][4632] ipam/ipam.go 394: Looking up existing affinities for host host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:48.052 [INFO][4632] ipam/ipam.go 511: Trying affinity for 192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:48.070 [INFO][4632] ipam/ipam.go 158: Attempting to load block cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:48.087 [INFO][4632] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:48.095 [INFO][4632] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.111.192/26 handle="k8s-pod-network.13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:48.113 [INFO][4632] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370 Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:48.137 [INFO][4632] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.111.192/26 handle="k8s-pod-network.13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:48.192 [INFO][4632] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.111.199/26] block=192.168.111.192/26 handle="k8s-pod-network.13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" 
host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:48.193 [INFO][4632] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.111.199/26] handle="k8s-pod-network.13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:48.195 [INFO][4632] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 9 15:00:48.378670 containerd[1558]: 2025-07-09 15:00:48.195 [INFO][4632] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.111.199/26] IPv6=[] ContainerID="13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" HandleID="k8s-pod-network.13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0" Jul 9 15:00:48.380385 containerd[1558]: 2025-07-09 15:00:48.214 [INFO][4455] cni-plugin/k8s.go 418: Populated endpoint ContainerID="13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" Namespace="calico-system" Pod="calico-kube-controllers-84d945fb8c-d8nqq" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0", GenerateName:"calico-kube-controllers-84d945fb8c-", Namespace:"calico-system", SelfLink:"", UID:"ee12b76c-10d0-472a-83d7-d8ece5511583", ResourceVersion:"989", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 58, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"84d945fb8c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"", Pod:"calico-kube-controllers-84d945fb8c-d8nqq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.111.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali675d02213e9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:48.380385 containerd[1558]: 2025-07-09 15:00:48.214 [INFO][4455] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.111.199/32] ContainerID="13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" Namespace="calico-system" Pod="calico-kube-controllers-84d945fb8c-d8nqq" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0" Jul 9 15:00:48.380385 containerd[1558]: 2025-07-09 15:00:48.214 [INFO][4455] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali675d02213e9 ContainerID="13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" Namespace="calico-system" 
Pod="calico-kube-controllers-84d945fb8c-d8nqq" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0" Jul 9 15:00:48.380385 containerd[1558]: 2025-07-09 15:00:48.280 [INFO][4455] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" Namespace="calico-system" Pod="calico-kube-controllers-84d945fb8c-d8nqq" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0" Jul 9 15:00:48.380385 containerd[1558]: 2025-07-09 15:00:48.291 [INFO][4455] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" Namespace="calico-system" Pod="calico-kube-controllers-84d945fb8c-d8nqq" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0", GenerateName:"calico-kube-controllers-84d945fb8c-", Namespace:"calico-system", SelfLink:"", UID:"ee12b76c-10d0-472a-83d7-d8ece5511583", ResourceVersion:"989", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 14, 58, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"84d945fb8c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370", Pod:"calico-kube-controllers-84d945fb8c-d8nqq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.111.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali675d02213e9", MAC:"a2:e0:30:73:93:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:48.380385 containerd[1558]: 2025-07-09 15:00:48.366 [INFO][4455] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" Namespace="calico-system" Pod="calico-kube-controllers-84d945fb8c-d8nqq" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-calico--kube--controllers--84d945fb8c--d8nqq-eth0" Jul 9 15:00:48.384247 containerd[1558]: time="2025-07-09T15:00:48.383961022Z" level=info msg="Container 7e40b1b7771c6ced82da142f9da37eacb2711bda3a86727f581718721a456d26: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:00:48.399013 containerd[1558]: time="2025-07-09T15:00:48.398976286Z" level=info msg="CreateContainer within sandbox \"ce6edae6fbe383dc0d8f1dff24978d5eb6a29c39d652cd0f13b74643c283bce5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"3b30e208b1178f269e4eb373a0642f9c5a271fdb4dbd34a30ad1015ad9ae717f\"" Jul 9 15:00:48.401516 containerd[1558]: time="2025-07-09T15:00:48.401239181Z" level=info msg="StartContainer for \"3b30e208b1178f269e4eb373a0642f9c5a271fdb4dbd34a30ad1015ad9ae717f\"" Jul 9 15:00:48.411526 containerd[1558]: time="2025-07-09T15:00:48.411429149Z" level=info msg="connecting to shim 3b30e208b1178f269e4eb373a0642f9c5a271fdb4dbd34a30ad1015ad9ae717f" address="unix:///run/containerd/s/064c79f3055918d92ba298e9f29d8d35576165f046172b4d68234803e7f84889" protocol=ttrpc version=3 Jul 9 15:00:48.424204 containerd[1558]: time="2025-07-09T15:00:48.423330102Z" level=info msg="CreateContainer within sandbox \"c1f45475e9523156bc4f5c2c905a42db83956d1e263227080a1f66199f470954\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7e40b1b7771c6ced82da142f9da37eacb2711bda3a86727f581718721a456d26\"" Jul 9 15:00:48.430476 containerd[1558]: time="2025-07-09T15:00:48.430300000Z" level=info msg="StartContainer for \"7e40b1b7771c6ced82da142f9da37eacb2711bda3a86727f581718721a456d26\"" Jul 9 15:00:48.445764 containerd[1558]: time="2025-07-09T15:00:48.445655164Z" level=info msg="connecting to shim 7e40b1b7771c6ced82da142f9da37eacb2711bda3a86727f581718721a456d26" address="unix:///run/containerd/s/61f0f0b5d85b9a557cb4cc44edd58efb3401e339352d0fcf786114a81a6f8bc8" protocol=ttrpc version=3 Jul 9 15:00:48.490422 systemd[1]: Started cri-containerd-3b30e208b1178f269e4eb373a0642f9c5a271fdb4dbd34a30ad1015ad9ae717f.scope - libcontainer container 3b30e208b1178f269e4eb373a0642f9c5a271fdb4dbd34a30ad1015ad9ae717f. Jul 9 15:00:48.569146 systemd[1]: Started cri-containerd-7e40b1b7771c6ced82da142f9da37eacb2711bda3a86727f581718721a456d26.scope - libcontainer container 7e40b1b7771c6ced82da142f9da37eacb2711bda3a86727f581718721a456d26. Jul 9 15:00:48.619914 containerd[1558]: time="2025-07-09T15:00:48.618649111Z" level=info msg="connecting to shim 13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370" address="unix:///run/containerd/s/8c306beb18367e4e9a7426480a9d99d54c44e609277d983abdd62a87b38b561d" namespace=k8s.io protocol=ttrpc version=3 Jul 9 15:00:48.670647 systemd-networkd[1444]: calieb0203028d2: Gained IPv6LL Jul 9 15:00:48.799728 containerd[1558]: time="2025-07-09T15:00:48.798064814Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-jhvtt,Uid:359bd74f-671b-4729-9999-52147917a66d,Namespace:calico-system,Attempt:0,} returns sandbox id \"a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9\"" Jul 9 15:00:48.798627 systemd-networkd[1444]: cali113ebcb3a43: Gained IPv6LL Jul 9 15:00:48.799107 systemd-networkd[1444]: cali466351894ef: Gained IPv6LL Jul 9 15:00:48.800725 systemd[1]: Started cri-containerd-13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370.scope - libcontainer container 13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370. 
Jul 9 15:00:48.818898 containerd[1558]: time="2025-07-09T15:00:48.818638955Z" level=info msg="StartContainer for \"3b30e208b1178f269e4eb373a0642f9c5a271fdb4dbd34a30ad1015ad9ae717f\" returns successfully" Jul 9 15:00:48.825095 containerd[1558]: time="2025-07-09T15:00:48.825007629Z" level=info msg="StartContainer for \"7e40b1b7771c6ced82da142f9da37eacb2711bda3a86727f581718721a456d26\" returns successfully" Jul 9 15:00:49.141491 containerd[1558]: time="2025-07-09T15:00:49.140274619Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-84d945fb8c-d8nqq,Uid:ee12b76c-10d0-472a-83d7-d8ece5511583,Namespace:calico-system,Attempt:0,} returns sandbox id \"13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370\"" Jul 9 15:00:49.438724 systemd-networkd[1444]: cali819b0b6bd21: Gained IPv6LL Jul 9 15:00:49.502722 systemd-networkd[1444]: cali675d02213e9: Gained IPv6LL Jul 9 15:00:49.862437 kubelet[2817]: I0709 15:00:49.860648 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-7fst7" podStartSLOduration=174.860589888 podStartE2EDuration="2m54.860589888s" podCreationTimestamp="2025-07-09 14:57:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 15:00:49.858186719 +0000 UTC m=+179.541487450" watchObservedRunningTime="2025-07-09 15:00:49.860589888 +0000 UTC m=+179.543890619" Jul 9 15:00:49.923004 kubelet[2817]: I0709 15:00:49.921668 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-j7g9m" podStartSLOduration=174.921639402 podStartE2EDuration="2m54.921639402s" podCreationTimestamp="2025-07-09 14:57:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-09 15:00:49.921547228 +0000 UTC m=+179.604847939" watchObservedRunningTime="2025-07-09 15:00:49.921639402 +0000 UTC m=+179.604940103" Jul 9 15:00:50.024768 systemd-networkd[1444]: vxlan.calico: Link UP Jul 9 15:00:50.024778 systemd-networkd[1444]: vxlan.calico: Gained carrier Jul 9 15:00:51.678977 systemd-networkd[1444]: vxlan.calico: Gained IPv6LL Jul 9 15:00:56.130634 containerd[1558]: time="2025-07-09T15:00:56.129692512Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:00:56.134566 containerd[1558]: time="2025-07-09T15:00:56.131307406Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Jul 9 15:00:56.134566 containerd[1558]: time="2025-07-09T15:00:56.133902886Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:00:56.138524 containerd[1558]: time="2025-07-09T15:00:56.138181530Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:00:56.139465 containerd[1558]: time="2025-07-09T15:00:56.139120470Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest 
\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 8.343514687s" Jul 9 15:00:56.139465 containerd[1558]: time="2025-07-09T15:00:56.139186644Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Jul 9 15:00:56.142878 containerd[1558]: time="2025-07-09T15:00:56.142527040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 9 15:00:56.145112 containerd[1558]: time="2025-07-09T15:00:56.145066916Z" level=info msg="CreateContainer within sandbox \"e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 9 15:00:56.173901 containerd[1558]: time="2025-07-09T15:00:56.173842806Z" level=info msg="Container 725ccf228f4d9005eed802c28099ee63481d8b134e9af3c193e3c8240ba34bfc: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:00:56.214784 containerd[1558]: time="2025-07-09T15:00:56.214715443Z" level=info msg="CreateContainer within sandbox \"e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"725ccf228f4d9005eed802c28099ee63481d8b134e9af3c193e3c8240ba34bfc\"" Jul 9 15:00:56.215518 containerd[1558]: time="2025-07-09T15:00:56.215476187Z" level=info msg="StartContainer for \"725ccf228f4d9005eed802c28099ee63481d8b134e9af3c193e3c8240ba34bfc\"" Jul 9 15:00:56.218242 containerd[1558]: time="2025-07-09T15:00:56.218197165Z" level=info msg="connecting to shim 725ccf228f4d9005eed802c28099ee63481d8b134e9af3c193e3c8240ba34bfc" address="unix:///run/containerd/s/bac8065457c681a8c08deb425609cff07be402208a83fd4c2967711fff66db57" protocol=ttrpc version=3 Jul 9 15:00:56.273793 systemd[1]: Started cri-containerd-725ccf228f4d9005eed802c28099ee63481d8b134e9af3c193e3c8240ba34bfc.scope - libcontainer container 725ccf228f4d9005eed802c28099ee63481d8b134e9af3c193e3c8240ba34bfc. 
Jul 9 15:00:56.383067 containerd[1558]: time="2025-07-09T15:00:56.382807633Z" level=info msg="StartContainer for \"725ccf228f4d9005eed802c28099ee63481d8b134e9af3c193e3c8240ba34bfc\" returns successfully" Jul 9 15:00:56.511325 containerd[1558]: time="2025-07-09T15:00:56.511284727Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b97bcbb6-ssbg8,Uid:33b985a0-8206-4e6c-9e8e-54c920a706c5,Namespace:calico-system,Attempt:0,}" Jul 9 15:00:56.800275 systemd-networkd[1444]: cali24436744a95: Link UP Jul 9 15:00:56.802519 systemd-networkd[1444]: cali24436744a95: Gained carrier Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.634 [INFO][5269] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0 whisker-5b97bcbb6- calico-system 33b985a0-8206-4e6c-9e8e-54c920a706c5 1035 0 2025-07-09 15:00:28 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5b97bcbb6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-9999-9-100-bf645a1a30.novalocal whisker-5b97bcbb6-ssbg8 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali24436744a95 [] [] }} ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Namespace="calico-system" Pod="whisker-5b97bcbb6-ssbg8" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.635 [INFO][5269] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Namespace="calico-system" Pod="whisker-5b97bcbb6-ssbg8" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.698 [INFO][5280] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.699 [INFO][5280] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103490), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-9-100-bf645a1a30.novalocal", "pod":"whisker-5b97bcbb6-ssbg8", "timestamp":"2025-07-09 15:00:56.698591908 +0000 UTC"}, Hostname:"ci-9999-9-100-bf645a1a30.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.699 [INFO][5280] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.699 [INFO][5280] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.699 [INFO][5280] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-bf645a1a30.novalocal' Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.730 [INFO][5280] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.737 [INFO][5280] ipam/ipam.go 394: Looking up existing affinities for host host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.747 [INFO][5280] ipam/ipam.go 511: Trying affinity for 192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.750 [INFO][5280] ipam/ipam.go 158: Attempting to load block cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.755 [INFO][5280] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.755 [INFO][5280] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.111.192/26 handle="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.758 [INFO][5280] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0 Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.765 [INFO][5280] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.111.192/26 handle="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.784 [INFO][5280] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.111.200/26] block=192.168.111.192/26 handle="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.785 [INFO][5280] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.111.200/26] handle="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.785 [INFO][5280] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 9 15:00:56.836283 containerd[1558]: 2025-07-09 15:00:56.785 [INFO][5280] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.111.200/26] IPv6=[] ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:00:56.839642 containerd[1558]: 2025-07-09 15:00:56.791 [INFO][5269] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Namespace="calico-system" Pod="whisker-5b97bcbb6-ssbg8" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0", GenerateName:"whisker-5b97bcbb6-", Namespace:"calico-system", SelfLink:"", UID:"33b985a0-8206-4e6c-9e8e-54c920a706c5", ResourceVersion:"1035", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 15, 0, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5b97bcbb6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"", Pod:"whisker-5b97bcbb6-ssbg8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.111.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali24436744a95", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:56.839642 containerd[1558]: 2025-07-09 15:00:56.791 [INFO][5269] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.111.200/32] ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Namespace="calico-system" Pod="whisker-5b97bcbb6-ssbg8" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:00:56.839642 containerd[1558]: 2025-07-09 15:00:56.791 [INFO][5269] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali24436744a95 ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Namespace="calico-system" Pod="whisker-5b97bcbb6-ssbg8" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:00:56.839642 containerd[1558]: 2025-07-09 15:00:56.802 [INFO][5269] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Namespace="calico-system" Pod="whisker-5b97bcbb6-ssbg8" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:00:56.839642 containerd[1558]: 2025-07-09 15:00:56.803 [INFO][5269] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Namespace="calico-system" Pod="whisker-5b97bcbb6-ssbg8" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0", GenerateName:"whisker-5b97bcbb6-", Namespace:"calico-system", SelfLink:"", UID:"33b985a0-8206-4e6c-9e8e-54c920a706c5", ResourceVersion:"1035", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 15, 0, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5b97bcbb6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0", Pod:"whisker-5b97bcbb6-ssbg8", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.111.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali24436744a95", MAC:"52:31:8a:48:c5:6b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:00:56.839642 containerd[1558]: 2025-07-09 15:00:56.821 [INFO][5269] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Namespace="calico-system" Pod="whisker-5b97bcbb6-ssbg8" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:00:56.898047 containerd[1558]: time="2025-07-09T15:00:56.897991509Z" level=info msg="connecting to shim 9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" address="unix:///run/containerd/s/8c3e16d8814bc427f7afa0bc1cc53ceb5530743bab14339c99965782b9241c4a" namespace=k8s.io protocol=ttrpc version=3 Jul 9 15:00:56.938696 systemd[1]: Started cri-containerd-9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0.scope - libcontainer container 9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0. 
Jul 9 15:00:57.014903 containerd[1558]: time="2025-07-09T15:00:57.014852410Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5b97bcbb6-ssbg8,Uid:33b985a0-8206-4e6c-9e8e-54c920a706c5,Namespace:calico-system,Attempt:0,} returns sandbox id \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\"" Jul 9 15:00:57.987003 containerd[1558]: time="2025-07-09T15:00:57.986920457Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" id:\"50bf30e692e84a15f463e9c7feed8e6cb3d71c3c458f8ae675014edeb53c8bb5\" pid:5359 exited_at:{seconds:1752073257 nanos:985925943}" Jul 9 15:00:58.463079 systemd-networkd[1444]: cali24436744a95: Gained IPv6LL Jul 9 15:01:05.796486 containerd[1558]: time="2025-07-09T15:01:05.796305662Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:05.798928 containerd[1558]: time="2025-07-09T15:01:05.798900332Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Jul 9 15:01:05.800574 containerd[1558]: time="2025-07-09T15:01:05.800521336Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:05.804654 containerd[1558]: time="2025-07-09T15:01:05.804597856Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:05.806475 containerd[1558]: time="2025-07-09T15:01:05.806387709Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 9.663810846s" Jul 9 15:01:05.806562 containerd[1558]: time="2025-07-09T15:01:05.806444626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 9 15:01:05.809142 containerd[1558]: time="2025-07-09T15:01:05.808917746Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 9 15:01:05.813740 containerd[1558]: time="2025-07-09T15:01:05.813691340Z" level=info msg="CreateContainer within sandbox \"91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 9 15:01:05.834480 containerd[1558]: time="2025-07-09T15:01:05.831758842Z" level=info msg="Container 81856221f383bcf060ba3820f4424391422400905a6f613c3954b5ecd22aae84: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:01:05.840394 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2175922917.mount: Deactivated successfully. 
Jul 9 15:01:05.850760 containerd[1558]: time="2025-07-09T15:01:05.850670465Z" level=info msg="CreateContainer within sandbox \"91b3a523867760350e35b6fa07599d4adc29cae7c5c2bcc86aff65521cb7bc3f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"81856221f383bcf060ba3820f4424391422400905a6f613c3954b5ecd22aae84\"" Jul 9 15:01:05.851720 containerd[1558]: time="2025-07-09T15:01:05.851597281Z" level=info msg="StartContainer for \"81856221f383bcf060ba3820f4424391422400905a6f613c3954b5ecd22aae84\"" Jul 9 15:01:05.859430 containerd[1558]: time="2025-07-09T15:01:05.858535534Z" level=info msg="connecting to shim 81856221f383bcf060ba3820f4424391422400905a6f613c3954b5ecd22aae84" address="unix:///run/containerd/s/631500851695963479ff4db819e3079cc957c88e5a7e5e5d8ba1d6b0195fe458" protocol=ttrpc version=3 Jul 9 15:01:05.905661 systemd[1]: Started cri-containerd-81856221f383bcf060ba3820f4424391422400905a6f613c3954b5ecd22aae84.scope - libcontainer container 81856221f383bcf060ba3820f4424391422400905a6f613c3954b5ecd22aae84. Jul 9 15:01:05.999171 containerd[1558]: time="2025-07-09T15:01:05.999123575Z" level=info msg="StartContainer for \"81856221f383bcf060ba3820f4424391422400905a6f613c3954b5ecd22aae84\" returns successfully" Jul 9 15:01:06.356961 containerd[1558]: time="2025-07-09T15:01:06.356107658Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:06.362135 containerd[1558]: time="2025-07-09T15:01:06.360946956Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 9 15:01:06.365245 containerd[1558]: time="2025-07-09T15:01:06.365199448Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 556.192764ms" Jul 9 15:01:06.365449 containerd[1558]: time="2025-07-09T15:01:06.365415425Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 9 15:01:06.368927 containerd[1558]: time="2025-07-09T15:01:06.368882338Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Jul 9 15:01:06.371111 containerd[1558]: time="2025-07-09T15:01:06.371062324Z" level=info msg="CreateContainer within sandbox \"088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 9 15:01:06.399513 containerd[1558]: time="2025-07-09T15:01:06.399412289Z" level=info msg="Container 628e7ea76ca42601d8820bbbf3e19cee10e0c6e98fa2283b8d8770e1ab4d6367: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:01:06.429859 containerd[1558]: time="2025-07-09T15:01:06.429818568Z" level=info msg="CreateContainer within sandbox \"088a90d757308d0e1bb0bd89a8fbdb833473116ee4a9feca56ea4edeab81b4df\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"628e7ea76ca42601d8820bbbf3e19cee10e0c6e98fa2283b8d8770e1ab4d6367\"" Jul 9 15:01:06.432884 containerd[1558]: time="2025-07-09T15:01:06.431601438Z" level=info msg="StartContainer for \"628e7ea76ca42601d8820bbbf3e19cee10e0c6e98fa2283b8d8770e1ab4d6367\"" Jul 9 15:01:06.434543 
containerd[1558]: time="2025-07-09T15:01:06.434499208Z" level=info msg="connecting to shim 628e7ea76ca42601d8820bbbf3e19cee10e0c6e98fa2283b8d8770e1ab4d6367" address="unix:///run/containerd/s/b520090a4847e61fec2db5161aa3565207466441886f6b3c335009fd232ed731" protocol=ttrpc version=3 Jul 9 15:01:06.483910 systemd[1]: Started cri-containerd-628e7ea76ca42601d8820bbbf3e19cee10e0c6e98fa2283b8d8770e1ab4d6367.scope - libcontainer container 628e7ea76ca42601d8820bbbf3e19cee10e0c6e98fa2283b8d8770e1ab4d6367. Jul 9 15:01:06.571281 containerd[1558]: time="2025-07-09T15:01:06.570946502Z" level=info msg="StartContainer for \"628e7ea76ca42601d8820bbbf3e19cee10e0c6e98fa2283b8d8770e1ab4d6367\" returns successfully" Jul 9 15:01:06.978897 kubelet[2817]: I0709 15:01:06.978712 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5c659cb4b9-8q7q2" podStartSLOduration=160.95784951 podStartE2EDuration="2m58.978605212s" podCreationTimestamp="2025-07-09 14:58:08 +0000 UTC" firstStartedPulling="2025-07-09 15:00:48.346711499 +0000 UTC m=+178.030012210" lastFinishedPulling="2025-07-09 15:01:06.367467211 +0000 UTC m=+196.050767912" observedRunningTime="2025-07-09 15:01:06.977372669 +0000 UTC m=+196.660673400" watchObservedRunningTime="2025-07-09 15:01:06.978605212 +0000 UTC m=+196.661905923" Jul 9 15:01:07.031073 kubelet[2817]: I0709 15:01:07.030950 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5c659cb4b9-2bwfx" podStartSLOduration=161.221122744 podStartE2EDuration="2m59.030926771s" podCreationTimestamp="2025-07-09 14:58:08 +0000 UTC" firstStartedPulling="2025-07-09 15:00:47.998224514 +0000 UTC m=+177.681525215" lastFinishedPulling="2025-07-09 15:01:05.808028531 +0000 UTC m=+195.491329242" observedRunningTime="2025-07-09 15:01:07.028317404 +0000 UTC m=+196.711618125" watchObservedRunningTime="2025-07-09 15:01:07.030926771 +0000 UTC m=+196.714227472" Jul 9 15:01:11.486860 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3749601618.mount: Deactivated successfully. 
Jul 9 15:01:12.854979 containerd[1558]: time="2025-07-09T15:01:12.854882941Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:12.858306 containerd[1558]: time="2025-07-09T15:01:12.858030652Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Jul 9 15:01:12.860482 containerd[1558]: time="2025-07-09T15:01:12.859915884Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:12.868496 containerd[1558]: time="2025-07-09T15:01:12.867594719Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:12.870475 containerd[1558]: time="2025-07-09T15:01:12.869910773Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 6.500784777s" Jul 9 15:01:12.870626 containerd[1558]: time="2025-07-09T15:01:12.870597557Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Jul 9 15:01:12.875126 containerd[1558]: time="2025-07-09T15:01:12.874199412Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 9 15:01:12.882035 containerd[1558]: time="2025-07-09T15:01:12.881993685Z" level=info msg="CreateContainer within sandbox \"a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jul 9 15:01:12.906504 containerd[1558]: time="2025-07-09T15:01:12.903826586Z" level=info msg="Container a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:01:12.911566 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3668491429.mount: Deactivated successfully. Jul 9 15:01:12.933703 containerd[1558]: time="2025-07-09T15:01:12.933645371Z" level=info msg="CreateContainer within sandbox \"a2f42ffc906174b39dda8d3826aa853638e98e0342e2e730af65b76ce49a44b9\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\"" Jul 9 15:01:12.934803 containerd[1558]: time="2025-07-09T15:01:12.934771092Z" level=info msg="StartContainer for \"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\"" Jul 9 15:01:12.940061 containerd[1558]: time="2025-07-09T15:01:12.938969101Z" level=info msg="connecting to shim a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50" address="unix:///run/containerd/s/8865673696ab0ed226747529de69a462592d3b5a20a3abf65dcf9da6880beab1" protocol=ttrpc version=3 Jul 9 15:01:13.022751 systemd[1]: Started cri-containerd-a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50.scope - libcontainer container a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50. 
Jul 9 15:01:13.235087 containerd[1558]: time="2025-07-09T15:01:13.234956942Z" level=info msg="StartContainer for \"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" returns successfully" Jul 9 15:01:14.214745 containerd[1558]: time="2025-07-09T15:01:14.214612106Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"f634aca79e4f66fcfea33a07b0703b5994a947ff86a8c5d1bd2c71747d467849\" pid:5533 exit_status:1 exited_at:{seconds:1752073274 nanos:212051813}" Jul 9 15:01:15.240403 containerd[1558]: time="2025-07-09T15:01:15.240081080Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"371862c96c531d878acf7fb7f70c185b404d2b66185252eaa8966497c7454b2a\" pid:5559 exit_status:1 exited_at:{seconds:1752073275 nanos:238701410}" Jul 9 15:01:21.312559 containerd[1558]: time="2025-07-09T15:01:21.312414976Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:21.315951 containerd[1558]: time="2025-07-09T15:01:21.315467435Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Jul 9 15:01:21.317965 containerd[1558]: time="2025-07-09T15:01:21.317113315Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:21.322953 containerd[1558]: time="2025-07-09T15:01:21.322416935Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:21.323349 containerd[1558]: time="2025-07-09T15:01:21.323304557Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 8.449058206s" Jul 9 15:01:21.323532 containerd[1558]: time="2025-07-09T15:01:21.323365171Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Jul 9 15:01:21.326507 containerd[1558]: time="2025-07-09T15:01:21.326221462Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 9 15:01:21.353875 containerd[1558]: time="2025-07-09T15:01:21.353617555Z" level=info msg="CreateContainer within sandbox \"13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jul 9 15:01:21.383303 containerd[1558]: time="2025-07-09T15:01:21.383250561Z" level=info msg="Container 8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:01:21.405768 containerd[1558]: time="2025-07-09T15:01:21.405719854Z" level=info msg="CreateContainer within sandbox \"13d384beba9fbf38002ca91f3d66906c940d57c5204cc2c255728cf94ad0b370\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id 
\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\"" Jul 9 15:01:21.407783 containerd[1558]: time="2025-07-09T15:01:21.407696667Z" level=info msg="StartContainer for \"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\"" Jul 9 15:01:21.410663 containerd[1558]: time="2025-07-09T15:01:21.410547547Z" level=info msg="connecting to shim 8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d" address="unix:///run/containerd/s/8c306beb18367e4e9a7426480a9d99d54c44e609277d983abdd62a87b38b561d" protocol=ttrpc version=3 Jul 9 15:01:21.476767 systemd[1]: Started cri-containerd-8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d.scope - libcontainer container 8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d. Jul 9 15:01:21.691878 containerd[1558]: time="2025-07-09T15:01:21.691807575Z" level=info msg="StartContainer for \"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" returns successfully" Jul 9 15:01:22.119325 kubelet[2817]: I0709 15:01:22.118440 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-84d945fb8c-d8nqq" podStartSLOduration=155.937634728 podStartE2EDuration="3m8.118297141s" podCreationTimestamp="2025-07-09 14:58:14 +0000 UTC" firstStartedPulling="2025-07-09 15:00:49.14520391 +0000 UTC m=+178.828504611" lastFinishedPulling="2025-07-09 15:01:21.325866323 +0000 UTC m=+211.009167024" observedRunningTime="2025-07-09 15:01:22.117704034 +0000 UTC m=+211.801004755" watchObservedRunningTime="2025-07-09 15:01:22.118297141 +0000 UTC m=+211.801597842" Jul 9 15:01:22.121746 kubelet[2817]: I0709 15:01:22.120745 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-jhvtt" podStartSLOduration=165.063072784 podStartE2EDuration="3m9.120622792s" podCreationTimestamp="2025-07-09 14:58:13 +0000 UTC" firstStartedPulling="2025-07-09 15:00:48.815742906 +0000 UTC m=+178.499043607" lastFinishedPulling="2025-07-09 15:01:12.873292904 +0000 UTC m=+202.556593615" observedRunningTime="2025-07-09 15:01:14.039182117 +0000 UTC m=+203.722482848" watchObservedRunningTime="2025-07-09 15:01:22.120622792 +0000 UTC m=+211.803923493" Jul 9 15:01:22.302893 containerd[1558]: time="2025-07-09T15:01:22.302679606Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"216b35eb3ca2d4cae7e3687f674be1af6bcdd465ceb3cbc876c7d7d556a35c60\" pid:5631 exited_at:{seconds:1752073282 nanos:302153365}" Jul 9 15:01:24.532654 containerd[1558]: time="2025-07-09T15:01:24.532594642Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:24.534910 containerd[1558]: time="2025-07-09T15:01:24.534583398Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Jul 9 15:01:24.536494 containerd[1558]: time="2025-07-09T15:01:24.536340878Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:24.543227 containerd[1558]: time="2025-07-09T15:01:24.542591551Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Jul 9 15:01:24.543545 containerd[1558]: time="2025-07-09T15:01:24.543183657Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 3.216921589s" Jul 9 15:01:24.543625 containerd[1558]: time="2025-07-09T15:01:24.543548193Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Jul 9 15:01:24.547490 containerd[1558]: time="2025-07-09T15:01:24.546294887Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 9 15:01:24.548580 containerd[1558]: time="2025-07-09T15:01:24.548545426Z" level=info msg="CreateContainer within sandbox \"e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jul 9 15:01:24.572092 containerd[1558]: time="2025-07-09T15:01:24.570870565Z" level=info msg="Container 46c0d2107b7152b4237a20e0f60cccec8dfcb94ab3aefbee91bc804bf7a5b793: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:01:24.579914 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2370129155.mount: Deactivated successfully. Jul 9 15:01:24.601896 containerd[1558]: time="2025-07-09T15:01:24.601822713Z" level=info msg="CreateContainer within sandbox \"e4864cf565511b5f4785a98788c8a1168454a15cb377ab6ca332f960d829faec\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"46c0d2107b7152b4237a20e0f60cccec8dfcb94ab3aefbee91bc804bf7a5b793\"" Jul 9 15:01:24.603526 containerd[1558]: time="2025-07-09T15:01:24.603496195Z" level=info msg="StartContainer for \"46c0d2107b7152b4237a20e0f60cccec8dfcb94ab3aefbee91bc804bf7a5b793\"" Jul 9 15:01:24.607889 containerd[1558]: time="2025-07-09T15:01:24.607847841Z" level=info msg="connecting to shim 46c0d2107b7152b4237a20e0f60cccec8dfcb94ab3aefbee91bc804bf7a5b793" address="unix:///run/containerd/s/bac8065457c681a8c08deb425609cff07be402208a83fd4c2967711fff66db57" protocol=ttrpc version=3 Jul 9 15:01:24.658729 systemd[1]: Started cri-containerd-46c0d2107b7152b4237a20e0f60cccec8dfcb94ab3aefbee91bc804bf7a5b793.scope - libcontainer container 46c0d2107b7152b4237a20e0f60cccec8dfcb94ab3aefbee91bc804bf7a5b793. 
Jul 9 15:01:24.955361 containerd[1558]: time="2025-07-09T15:01:24.955317158Z" level=info msg="StartContainer for \"46c0d2107b7152b4237a20e0f60cccec8dfcb94ab3aefbee91bc804bf7a5b793\" returns successfully" Jul 9 15:01:25.600507 kubelet[2817]: I0709 15:01:25.599055 2817 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jul 9 15:01:25.600507 kubelet[2817]: I0709 15:01:25.599158 2817 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jul 9 15:01:27.043165 containerd[1558]: time="2025-07-09T15:01:27.042961331Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"2774e38681aede72402eca4bc623a8c326f96f18b05a8189ee3d7e4e729c1644\" pid:5703 exited_at:{seconds:1752073286 nanos:907443172}" Jul 9 15:01:27.043165 containerd[1558]: time="2025-07-09T15:01:27.043020442Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"fb60aa433c505c73f15bba9097b25d643c294de23c2a4c0d1d57d444d80ca706\" pid:5729 exited_at:{seconds:1752073286 nanos:832835684}" Jul 9 15:01:27.146902 kubelet[2817]: I0709 15:01:27.146794 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-sbhtt" podStartSLOduration=156.393522218 podStartE2EDuration="3m13.146776016s" podCreationTimestamp="2025-07-09 14:58:14 +0000 UTC" firstStartedPulling="2025-07-09 15:00:47.79279748 +0000 UTC m=+177.476098181" lastFinishedPulling="2025-07-09 15:01:24.546051268 +0000 UTC m=+214.229351979" observedRunningTime="2025-07-09 15:01:25.09979599 +0000 UTC m=+214.783096731" watchObservedRunningTime="2025-07-09 15:01:27.146776016 +0000 UTC m=+216.830076727" Jul 9 15:01:27.166566 containerd[1558]: time="2025-07-09T15:01:27.166512828Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"bf3538d52eef3a46992615ae4c54b04fe5e42f2999e7c4fe86e3829ced97b6a6\" pid:5699 exited_at:{seconds:1752073287 nanos:161350716}" Jul 9 15:01:27.933233 containerd[1558]: time="2025-07-09T15:01:27.932992470Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" id:\"94ecc0c529e8e87b210b98651b645c5503f36fd64a82fd04d5dd616f69b741bd\" pid:5759 exited_at:{seconds:1752073287 nanos:932167156}" Jul 9 15:01:30.338542 containerd[1558]: time="2025-07-09T15:01:30.336442737Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:30.350032 containerd[1558]: time="2025-07-09T15:01:30.349941960Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Jul 9 15:01:30.372517 containerd[1558]: time="2025-07-09T15:01:30.371868856Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:30.391493 containerd[1558]: time="2025-07-09T15:01:30.390881252Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:30.392853 containerd[1558]: time="2025-07-09T15:01:30.392768487Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 5.84635204s" Jul 9 15:01:30.393075 containerd[1558]: time="2025-07-09T15:01:30.393044767Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Jul 9 15:01:30.414292 containerd[1558]: time="2025-07-09T15:01:30.414221399Z" level=info msg="CreateContainer within sandbox \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 9 15:01:30.492667 containerd[1558]: time="2025-07-09T15:01:30.491946410Z" level=info msg="Container 67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:01:30.569861 containerd[1558]: time="2025-07-09T15:01:30.569603413Z" level=info msg="CreateContainer within sandbox \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\"" Jul 9 15:01:30.572909 containerd[1558]: time="2025-07-09T15:01:30.572676651Z" level=info msg="StartContainer for \"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\"" Jul 9 15:01:30.576643 containerd[1558]: time="2025-07-09T15:01:30.576589861Z" level=info msg="connecting to shim 67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4" address="unix:///run/containerd/s/8c3e16d8814bc427f7afa0bc1cc53ceb5530743bab14339c99965782b9241c4a" protocol=ttrpc version=3 Jul 9 15:01:30.620925 systemd[1]: Started cri-containerd-67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4.scope - libcontainer container 67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4. Jul 9 15:01:30.778655 containerd[1558]: time="2025-07-09T15:01:30.778524268Z" level=info msg="StartContainer for \"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\" returns successfully" Jul 9 15:01:30.781963 containerd[1558]: time="2025-07-09T15:01:30.781916848Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 9 15:01:36.991324 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3731764923.mount: Deactivated successfully. 
Jul 9 15:01:37.089272 containerd[1558]: time="2025-07-09T15:01:37.089166094Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:37.092702 containerd[1558]: time="2025-07-09T15:01:37.092344880Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Jul 9 15:01:37.095856 containerd[1558]: time="2025-07-09T15:01:37.095775982Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:37.104479 containerd[1558]: time="2025-07-09T15:01:37.103482664Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 9 15:01:37.107473 containerd[1558]: time="2025-07-09T15:01:37.106873018Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 6.324861504s" Jul 9 15:01:37.107473 containerd[1558]: time="2025-07-09T15:01:37.106940266Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Jul 9 15:01:37.115868 containerd[1558]: time="2025-07-09T15:01:37.115779491Z" level=info msg="CreateContainer within sandbox \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 9 15:01:37.163155 containerd[1558]: time="2025-07-09T15:01:37.163047854Z" level=info msg="Container 9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:01:37.175735 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3995821773.mount: Deactivated successfully. Jul 9 15:01:37.196807 containerd[1558]: time="2025-07-09T15:01:37.196745601Z" level=info msg="CreateContainer within sandbox \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\"" Jul 9 15:01:37.199715 containerd[1558]: time="2025-07-09T15:01:37.199684135Z" level=info msg="StartContainer for \"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\"" Jul 9 15:01:37.202495 containerd[1558]: time="2025-07-09T15:01:37.201965411Z" level=info msg="connecting to shim 9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef" address="unix:///run/containerd/s/8c3e16d8814bc427f7afa0bc1cc53ceb5530743bab14339c99965782b9241c4a" protocol=ttrpc version=3 Jul 9 15:01:37.247717 systemd[1]: Started cri-containerd-9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef.scope - libcontainer container 9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef. 
Jul 9 15:01:37.389562 containerd[1558]: time="2025-07-09T15:01:37.388433075Z" level=info msg="StartContainer for \"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\" returns successfully" Jul 9 15:01:38.179722 containerd[1558]: time="2025-07-09T15:01:38.178038427Z" level=info msg="StopContainer for \"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\" with timeout 30 (s)" Jul 9 15:01:38.195928 containerd[1558]: time="2025-07-09T15:01:38.195871689Z" level=info msg="Stop container \"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\" with signal terminated" Jul 9 15:01:38.196647 containerd[1558]: time="2025-07-09T15:01:38.196103085Z" level=info msg="StopContainer for \"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\" with timeout 30 (s)" Jul 9 15:01:38.197846 containerd[1558]: time="2025-07-09T15:01:38.197796614Z" level=info msg="Stop container \"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\" with signal terminated" Jul 9 15:01:38.215816 systemd[1]: cri-containerd-9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef.scope: Deactivated successfully. Jul 9 15:01:38.225772 containerd[1558]: time="2025-07-09T15:01:38.225730246Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\" id:\"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\" pid:5834 exit_status:2 exited_at:{seconds:1752073298 nanos:225333629}" Jul 9 15:01:38.227758 containerd[1558]: time="2025-07-09T15:01:38.227660912Z" level=info msg="received exit event container_id:\"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\" id:\"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\" pid:5834 exit_status:2 exited_at:{seconds:1752073298 nanos:225333629}" Jul 9 15:01:38.233670 kubelet[2817]: I0709 15:01:38.233392 2817 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5b97bcbb6-ssbg8" podStartSLOduration=30.141056236 podStartE2EDuration="1m10.233231101s" podCreationTimestamp="2025-07-09 15:00:28 +0000 UTC" firstStartedPulling="2025-07-09 15:00:57.017812058 +0000 UTC m=+186.701112769" lastFinishedPulling="2025-07-09 15:01:37.109986542 +0000 UTC m=+226.793287634" observedRunningTime="2025-07-09 15:01:38.221048479 +0000 UTC m=+227.904349180" watchObservedRunningTime="2025-07-09 15:01:38.233231101 +0000 UTC m=+227.916531802" Jul 9 15:01:38.274365 systemd[1]: cri-containerd-67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4.scope: Deactivated successfully. Jul 9 15:01:38.282125 containerd[1558]: time="2025-07-09T15:01:38.281771547Z" level=info msg="received exit event container_id:\"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\" id:\"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\" pid:5787 exited_at:{seconds:1752073298 nanos:280769581}" Jul 9 15:01:38.282125 containerd[1558]: time="2025-07-09T15:01:38.282089387Z" level=info msg="TaskExit event in podsandbox handler container_id:\"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\" id:\"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\" pid:5787 exited_at:{seconds:1752073298 nanos:280769581}" Jul 9 15:01:38.287938 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef-rootfs.mount: Deactivated successfully. 
Jul 9 15:01:38.354936 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4-rootfs.mount: Deactivated successfully. Jul 9 15:01:39.025484 containerd[1558]: time="2025-07-09T15:01:39.022704307Z" level=info msg="StopContainer for \"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\" returns successfully" Jul 9 15:01:39.028789 containerd[1558]: time="2025-07-09T15:01:39.028752037Z" level=info msg="StopContainer for \"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\" returns successfully" Jul 9 15:01:39.029906 containerd[1558]: time="2025-07-09T15:01:39.029844533Z" level=info msg="StopPodSandbox for \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\"" Jul 9 15:01:39.030128 containerd[1558]: time="2025-07-09T15:01:39.030084195Z" level=info msg="Container to stop \"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 9 15:01:39.030193 containerd[1558]: time="2025-07-09T15:01:39.030130482Z" level=info msg="Container to stop \"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 9 15:01:39.052770 systemd[1]: cri-containerd-9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0.scope: Deactivated successfully. Jul 9 15:01:39.054192 containerd[1558]: time="2025-07-09T15:01:39.054144703Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" id:\"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" pid:5331 exit_status:137 exited_at:{seconds:1752073299 nanos:52999446}" Jul 9 15:01:39.139651 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0-rootfs.mount: Deactivated successfully. Jul 9 15:01:39.151807 containerd[1558]: time="2025-07-09T15:01:39.151552971Z" level=info msg="shim disconnected" id=9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0 namespace=k8s.io Jul 9 15:01:39.151807 containerd[1558]: time="2025-07-09T15:01:39.151590182Z" level=warning msg="cleaning up after shim disconnected" id=9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0 namespace=k8s.io Jul 9 15:01:39.151807 containerd[1558]: time="2025-07-09T15:01:39.151599519Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jul 9 15:01:39.202576 containerd[1558]: time="2025-07-09T15:01:39.202099665Z" level=info msg="received exit event sandbox_id:\"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" exit_status:137 exited_at:{seconds:1752073299 nanos:52999446}" Jul 9 15:01:39.210163 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0-shm.mount: Deactivated successfully. Jul 9 15:01:39.321610 systemd-networkd[1444]: cali24436744a95: Link DOWN Jul 9 15:01:39.321621 systemd-networkd[1444]: cali24436744a95: Lost carrier Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.313 [INFO][5943] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.314 [INFO][5943] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" iface="eth0" netns="/var/run/netns/cni-9f08cf96-3765-5881-1f8f-c91b7e5db409" Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.317 [INFO][5943] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" iface="eth0" netns="/var/run/netns/cni-9f08cf96-3765-5881-1f8f-c91b7e5db409" Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.333 [INFO][5943] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" after=16.953896ms iface="eth0" netns="/var/run/netns/cni-9f08cf96-3765-5881-1f8f-c91b7e5db409" Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.333 [INFO][5943] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.333 [INFO][5943] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.407 [INFO][5951] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.408 [INFO][5951] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.408 [INFO][5951] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.485 [INFO][5951] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.485 [INFO][5951] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.489 [INFO][5951] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 9 15:01:39.495558 containerd[1558]: 2025-07-09 15:01:39.491 [INFO][5943] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:39.497980 containerd[1558]: time="2025-07-09T15:01:39.497298771Z" level=info msg="TearDown network for sandbox \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" successfully" Jul 9 15:01:39.497980 containerd[1558]: time="2025-07-09T15:01:39.497373282Z" level=info msg="StopPodSandbox for \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" returns successfully" Jul 9 15:01:39.502663 systemd[1]: run-netns-cni\x2d9f08cf96\x2d3765\x2d5881\x2d1f8f\x2dc91b7e5db409.mount: Deactivated successfully. 
Jul 9 15:01:39.647877 kubelet[2817]: I0709 15:01:39.647782 2817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj7fb\" (UniqueName: \"kubernetes.io/projected/33b985a0-8206-4e6c-9e8e-54c920a706c5-kube-api-access-nj7fb\") pod \"33b985a0-8206-4e6c-9e8e-54c920a706c5\" (UID: \"33b985a0-8206-4e6c-9e8e-54c920a706c5\") " Jul 9 15:01:39.647877 kubelet[2817]: I0709 15:01:39.647862 2817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b985a0-8206-4e6c-9e8e-54c920a706c5-whisker-ca-bundle\") pod \"33b985a0-8206-4e6c-9e8e-54c920a706c5\" (UID: \"33b985a0-8206-4e6c-9e8e-54c920a706c5\") " Jul 9 15:01:39.647877 kubelet[2817]: I0709 15:01:39.647887 2817 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/33b985a0-8206-4e6c-9e8e-54c920a706c5-whisker-backend-key-pair\") pod \"33b985a0-8206-4e6c-9e8e-54c920a706c5\" (UID: \"33b985a0-8206-4e6c-9e8e-54c920a706c5\") " Jul 9 15:01:39.653179 kubelet[2817]: I0709 15:01:39.653096 2817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33b985a0-8206-4e6c-9e8e-54c920a706c5-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "33b985a0-8206-4e6c-9e8e-54c920a706c5" (UID: "33b985a0-8206-4e6c-9e8e-54c920a706c5"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jul 9 15:01:39.659684 systemd[1]: var-lib-kubelet-pods-33b985a0\x2d8206\x2d4e6c\x2d9e8e\x2d54c920a706c5-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jul 9 15:01:39.660705 kubelet[2817]: I0709 15:01:39.660105 2817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b985a0-8206-4e6c-9e8e-54c920a706c5-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "33b985a0-8206-4e6c-9e8e-54c920a706c5" (UID: "33b985a0-8206-4e6c-9e8e-54c920a706c5"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 9 15:01:39.663836 kubelet[2817]: I0709 15:01:39.663782 2817 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b985a0-8206-4e6c-9e8e-54c920a706c5-kube-api-access-nj7fb" (OuterVolumeSpecName: "kube-api-access-nj7fb") pod "33b985a0-8206-4e6c-9e8e-54c920a706c5" (UID: "33b985a0-8206-4e6c-9e8e-54c920a706c5"). InnerVolumeSpecName "kube-api-access-nj7fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 9 15:01:39.666096 systemd[1]: var-lib-kubelet-pods-33b985a0\x2d8206\x2d4e6c\x2d9e8e\x2d54c920a706c5-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dnj7fb.mount: Deactivated successfully. 
Jul 9 15:01:39.748802 kubelet[2817]: I0709 15:01:39.748641 2817 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/33b985a0-8206-4e6c-9e8e-54c920a706c5-whisker-backend-key-pair\") on node \"ci-9999-9-100-bf645a1a30.novalocal\" DevicePath \"\"" Jul 9 15:01:39.748802 kubelet[2817]: I0709 15:01:39.748744 2817 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj7fb\" (UniqueName: \"kubernetes.io/projected/33b985a0-8206-4e6c-9e8e-54c920a706c5-kube-api-access-nj7fb\") on node \"ci-9999-9-100-bf645a1a30.novalocal\" DevicePath \"\"" Jul 9 15:01:39.748802 kubelet[2817]: I0709 15:01:39.748760 2817 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33b985a0-8206-4e6c-9e8e-54c920a706c5-whisker-ca-bundle\") on node \"ci-9999-9-100-bf645a1a30.novalocal\" DevicePath \"\"" Jul 9 15:01:40.206206 kubelet[2817]: I0709 15:01:40.205613 2817 scope.go:117] "RemoveContainer" containerID="9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef" Jul 9 15:01:40.217273 containerd[1558]: time="2025-07-09T15:01:40.217216836Z" level=info msg="RemoveContainer for \"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\"" Jul 9 15:01:40.232600 systemd[1]: Removed slice kubepods-besteffort-pod33b985a0_8206_4e6c_9e8e_54c920a706c5.slice - libcontainer container kubepods-besteffort-pod33b985a0_8206_4e6c_9e8e_54c920a706c5.slice. Jul 9 15:01:40.239845 containerd[1558]: time="2025-07-09T15:01:40.239757201Z" level=info msg="RemoveContainer for \"9bc73099bd7621997ca406e70bc30b8ec1f72d845329b70879e281816c4be8ef\" returns successfully" Jul 9 15:01:40.263369 kubelet[2817]: I0709 15:01:40.263311 2817 scope.go:117] "RemoveContainer" containerID="67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4" Jul 9 15:01:40.267254 containerd[1558]: time="2025-07-09T15:01:40.267211429Z" level=info msg="RemoveContainer for \"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\"" Jul 9 15:01:40.287049 containerd[1558]: time="2025-07-09T15:01:40.286984183Z" level=info msg="RemoveContainer for \"67024fb29c54368da6c756d322ed0c9a8888a7f36ee648610344a9ee63f113e4\" returns successfully" Jul 9 15:01:40.374121 kubelet[2817]: E0709 15:01:40.374067 2817 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="33b985a0-8206-4e6c-9e8e-54c920a706c5" containerName="whisker-backend" Jul 9 15:01:40.374753 kubelet[2817]: E0709 15:01:40.374274 2817 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="33b985a0-8206-4e6c-9e8e-54c920a706c5" containerName="whisker" Jul 9 15:01:40.375550 kubelet[2817]: I0709 15:01:40.375223 2817 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b985a0-8206-4e6c-9e8e-54c920a706c5" containerName="whisker-backend" Jul 9 15:01:40.375550 kubelet[2817]: I0709 15:01:40.375251 2817 memory_manager.go:354] "RemoveStaleState removing state" podUID="33b985a0-8206-4e6c-9e8e-54c920a706c5" containerName="whisker" Jul 9 15:01:40.388116 systemd[1]: Created slice kubepods-besteffort-pod4942d732_70c5_40ff_a509_23bd6295a80e.slice - libcontainer container kubepods-besteffort-pod4942d732_70c5_40ff_a509_23bd6295a80e.slice. 
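The slice created above also shows the kubelet's systemd cgroup driver naming convention: the QoS class plus the pod UID with dashes replaced by underscores, so UID 4942d732-70c5-40ff-a509-23bd6295a80e becomes kubepods-besteffort-pod4942d732_70c5_40ff_a509_23bd6295a80e.slice. A one-function illustration (the helper name is ours, not the kubelet's):

// pod_slice_name.go — illustrates the BestEffort slice naming visible above.
package main

import (
	"fmt"
	"strings"
)

// besteffortSlice maps a pod UID to the systemd slice name used by the
// kubelet's systemd cgroup driver for BestEffort pods.
func besteffortSlice(podUID string) string {
	return "kubepods-besteffort-pod" + strings.ReplaceAll(podUID, "-", "_") + ".slice"
}

func main() {
	fmt.Println(besteffortSlice("4942d732-70c5-40ff-a509-23bd6295a80e"))
	// kubepods-besteffort-pod4942d732_70c5_40ff_a509_23bd6295a80e.slice
}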
Jul 9 15:01:40.515618 kubelet[2817]: I0709 15:01:40.515435 2817 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b985a0-8206-4e6c-9e8e-54c920a706c5" path="/var/lib/kubelet/pods/33b985a0-8206-4e6c-9e8e-54c920a706c5/volumes" Jul 9 15:01:40.556087 kubelet[2817]: I0709 15:01:40.556034 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4942d732-70c5-40ff-a509-23bd6295a80e-whisker-ca-bundle\") pod \"whisker-58675b49d-tl5xz\" (UID: \"4942d732-70c5-40ff-a509-23bd6295a80e\") " pod="calico-system/whisker-58675b49d-tl5xz" Jul 9 15:01:40.556087 kubelet[2817]: I0709 15:01:40.556081 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trtpt\" (UniqueName: \"kubernetes.io/projected/4942d732-70c5-40ff-a509-23bd6295a80e-kube-api-access-trtpt\") pod \"whisker-58675b49d-tl5xz\" (UID: \"4942d732-70c5-40ff-a509-23bd6295a80e\") " pod="calico-system/whisker-58675b49d-tl5xz" Jul 9 15:01:40.556087 kubelet[2817]: I0709 15:01:40.556110 2817 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4942d732-70c5-40ff-a509-23bd6295a80e-whisker-backend-key-pair\") pod \"whisker-58675b49d-tl5xz\" (UID: \"4942d732-70c5-40ff-a509-23bd6295a80e\") " pod="calico-system/whisker-58675b49d-tl5xz" Jul 9 15:01:40.697976 containerd[1558]: time="2025-07-09T15:01:40.697919150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58675b49d-tl5xz,Uid:4942d732-70c5-40ff-a509-23bd6295a80e,Namespace:calico-system,Attempt:0,}" Jul 9 15:01:40.914778 systemd-networkd[1444]: calib2618bc9c2a: Link UP Jul 9 15:01:40.915853 systemd-networkd[1444]: calib2618bc9c2a: Gained carrier Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.787 [INFO][5977] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0 whisker-58675b49d- calico-system 4942d732-70c5-40ff-a509-23bd6295a80e 1323 0 2025-07-09 15:01:40 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:58675b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-9999-9-100-bf645a1a30.novalocal whisker-58675b49d-tl5xz eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calib2618bc9c2a [] [] }} ContainerID="06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" Namespace="calico-system" Pod="whisker-58675b49d-tl5xz" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-" Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.787 [INFO][5977] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" Namespace="calico-system" Pod="whisker-58675b49d-tl5xz" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0" Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.839 [INFO][5986] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" HandleID="k8s-pod-network.06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" 
Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0" Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.840 [INFO][5986] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" HandleID="k8s-pod-network.06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5680), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-9-100-bf645a1a30.novalocal", "pod":"whisker-58675b49d-tl5xz", "timestamp":"2025-07-09 15:01:40.83989487 +0000 UTC"}, Hostname:"ci-9999-9-100-bf645a1a30.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.840 [INFO][5986] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.841 [INFO][5986] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.841 [INFO][5986] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-9-100-bf645a1a30.novalocal' Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.855 [INFO][5986] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.866 [INFO][5986] ipam/ipam.go 394: Looking up existing affinities for host host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.875 [INFO][5986] ipam/ipam.go 511: Trying affinity for 192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.879 [INFO][5986] ipam/ipam.go 158: Attempting to load block cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.884 [INFO][5986] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.111.192/26 host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.884 [INFO][5986] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.111.192/26 handle="k8s-pod-network.06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.886 [INFO][5986] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23 Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.895 [INFO][5986] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.111.192/26 handle="k8s-pod-network.06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.905 [INFO][5986] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.111.201/26] block=192.168.111.192/26 handle="k8s-pod-network.06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:01:40.942335 
containerd[1558]: 2025-07-09 15:01:40.906 [INFO][5986] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.111.201/26] handle="k8s-pod-network.06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" host="ci-9999-9-100-bf645a1a30.novalocal" Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.906 [INFO][5986] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 9 15:01:40.942335 containerd[1558]: 2025-07-09 15:01:40.906 [INFO][5986] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.111.201/26] IPv6=[] ContainerID="06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" HandleID="k8s-pod-network.06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0" Jul 9 15:01:40.944785 containerd[1558]: 2025-07-09 15:01:40.908 [INFO][5977] cni-plugin/k8s.go 418: Populated endpoint ContainerID="06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" Namespace="calico-system" Pod="whisker-58675b49d-tl5xz" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0", GenerateName:"whisker-58675b49d-", Namespace:"calico-system", SelfLink:"", UID:"4942d732-70c5-40ff-a509-23bd6295a80e", ResourceVersion:"1323", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 15, 1, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58675b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"", Pod:"whisker-58675b49d-tl5xz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.111.201/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib2618bc9c2a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:01:40.944785 containerd[1558]: 2025-07-09 15:01:40.908 [INFO][5977] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.111.201/32] ContainerID="06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" Namespace="calico-system" Pod="whisker-58675b49d-tl5xz" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0" Jul 9 15:01:40.944785 containerd[1558]: 2025-07-09 15:01:40.908 [INFO][5977] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib2618bc9c2a ContainerID="06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" Namespace="calico-system" Pod="whisker-58675b49d-tl5xz" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0" Jul 9 15:01:40.944785 containerd[1558]: 2025-07-09 15:01:40.916 [INFO][5977] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" Namespace="calico-system" Pod="whisker-58675b49d-tl5xz" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0" Jul 9 15:01:40.944785 containerd[1558]: 2025-07-09 15:01:40.918 [INFO][5977] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" Namespace="calico-system" Pod="whisker-58675b49d-tl5xz" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0", GenerateName:"whisker-58675b49d-", Namespace:"calico-system", SelfLink:"", UID:"4942d732-70c5-40ff-a509-23bd6295a80e", ResourceVersion:"1323", Generation:0, CreationTimestamp:time.Date(2025, time.July, 9, 15, 1, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"58675b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-9-100-bf645a1a30.novalocal", ContainerID:"06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23", Pod:"whisker-58675b49d-tl5xz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.111.201/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib2618bc9c2a", MAC:"22:a4:93:73:52:7a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 9 15:01:40.944785 containerd[1558]: 2025-07-09 15:01:40.935 [INFO][5977] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" Namespace="calico-system" Pod="whisker-58675b49d-tl5xz" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--58675b49d--tl5xz-eth0" Jul 9 15:01:40.987429 containerd[1558]: time="2025-07-09T15:01:40.985584252Z" level=info msg="connecting to shim 06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23" address="unix:///run/containerd/s/92008ad6993d5a313ce60bd944287896ac69f8632c5acfa5eb95297b4dc0251e" namespace=k8s.io protocol=ttrpc version=3 Jul 9 15:01:41.032805 systemd[1]: Started cri-containerd-06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23.scope - libcontainer container 06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23. 
Jul 9 15:01:41.141821 containerd[1558]: time="2025-07-09T15:01:41.141666803Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58675b49d-tl5xz,Uid:4942d732-70c5-40ff-a509-23bd6295a80e,Namespace:calico-system,Attempt:0,} returns sandbox id \"06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23\"" Jul 9 15:01:41.148089 containerd[1558]: time="2025-07-09T15:01:41.148039152Z" level=info msg="CreateContainer within sandbox \"06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 9 15:01:41.165194 containerd[1558]: time="2025-07-09T15:01:41.164488566Z" level=info msg="Container e994d52737fac98499bb5c78650dc58769c8707b39ad355cebfd5f6207656ffc: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:01:41.199249 containerd[1558]: time="2025-07-09T15:01:41.199194730Z" level=info msg="CreateContainer within sandbox \"06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"e994d52737fac98499bb5c78650dc58769c8707b39ad355cebfd5f6207656ffc\"" Jul 9 15:01:41.202366 containerd[1558]: time="2025-07-09T15:01:41.202261505Z" level=info msg="StartContainer for \"e994d52737fac98499bb5c78650dc58769c8707b39ad355cebfd5f6207656ffc\"" Jul 9 15:01:41.205046 containerd[1558]: time="2025-07-09T15:01:41.204878203Z" level=info msg="connecting to shim e994d52737fac98499bb5c78650dc58769c8707b39ad355cebfd5f6207656ffc" address="unix:///run/containerd/s/92008ad6993d5a313ce60bd944287896ac69f8632c5acfa5eb95297b4dc0251e" protocol=ttrpc version=3 Jul 9 15:01:41.245821 systemd[1]: Started cri-containerd-e994d52737fac98499bb5c78650dc58769c8707b39ad355cebfd5f6207656ffc.scope - libcontainer container e994d52737fac98499bb5c78650dc58769c8707b39ad355cebfd5f6207656ffc. 
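The entries above and below follow the usual CRI ordering: RunPodSandbox returns a sandbox id, then each container (whisker here, whisker-backend just after) is created within that sandbox and started, all multiplexed over the same containerd shim socket. A compressed sketch of that call order against a hypothetical client interface, not the real k8s.io/cri-api types:

// cri_flow.go — call ordering only; the client interface is a hypothetical
// stand-in for a CRI runtime client, not the actual k8s.io/cri-api surface.
package main

import "fmt"

type runtimeClient interface {
	RunPodSandbox(name string) (sandboxID string, err error)
	CreateContainer(sandboxID, name string) (containerID string, err error)
	StartContainer(containerID string) error
}

type fakeRuntime struct{ n int }

func (f *fakeRuntime) RunPodSandbox(name string) (string, error) {
	return "sandbox-" + name, nil
}

func (f *fakeRuntime) CreateContainer(sandboxID, name string) (string, error) {
	f.n++
	return fmt.Sprintf("%s/ctr-%d-%s", sandboxID, f.n, name), nil
}

func (f *fakeRuntime) StartContainer(id string) error {
	fmt.Println("StartContainer for", id, "returns successfully")
	return nil
}

func main() {
	var rc runtimeClient = &fakeRuntime{}
	sb, _ := rc.RunPodSandbox("whisker-58675b49d-tl5xz")
	for _, name := range []string{"whisker", "whisker-backend"} {
		id, _ := rc.CreateContainer(sb, name) // CreateContainer within sandbox
		_ = rc.StartContainer(id)             // then StartContainer
	}
}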
Jul 9 15:01:41.380058 containerd[1558]: time="2025-07-09T15:01:41.379942195Z" level=info msg="StartContainer for \"e994d52737fac98499bb5c78650dc58769c8707b39ad355cebfd5f6207656ffc\" returns successfully" Jul 9 15:01:41.386002 containerd[1558]: time="2025-07-09T15:01:41.385946512Z" level=info msg="CreateContainer within sandbox \"06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 9 15:01:41.429005 containerd[1558]: time="2025-07-09T15:01:41.428830925Z" level=info msg="Container a26a56ff867c05c0ad5edcb5b7d6c09d1ce7ea8cabc05fe137870afc2e954af5: CDI devices from CRI Config.CDIDevices: []" Jul 9 15:01:41.468946 containerd[1558]: time="2025-07-09T15:01:41.468862826Z" level=info msg="CreateContainer within sandbox \"06044a05d76083e70479ae81dbc6dddabf0015dba73797dc23529a06d0f5ca23\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"a26a56ff867c05c0ad5edcb5b7d6c09d1ce7ea8cabc05fe137870afc2e954af5\"" Jul 9 15:01:41.471706 containerd[1558]: time="2025-07-09T15:01:41.470009736Z" level=info msg="StartContainer for \"a26a56ff867c05c0ad5edcb5b7d6c09d1ce7ea8cabc05fe137870afc2e954af5\"" Jul 9 15:01:41.474441 containerd[1558]: time="2025-07-09T15:01:41.474391878Z" level=info msg="connecting to shim a26a56ff867c05c0ad5edcb5b7d6c09d1ce7ea8cabc05fe137870afc2e954af5" address="unix:///run/containerd/s/92008ad6993d5a313ce60bd944287896ac69f8632c5acfa5eb95297b4dc0251e" protocol=ttrpc version=3 Jul 9 15:01:41.515000 systemd[1]: Started cri-containerd-a26a56ff867c05c0ad5edcb5b7d6c09d1ce7ea8cabc05fe137870afc2e954af5.scope - libcontainer container a26a56ff867c05c0ad5edcb5b7d6c09d1ce7ea8cabc05fe137870afc2e954af5. Jul 9 15:01:41.739082 containerd[1558]: time="2025-07-09T15:01:41.738200571Z" level=info msg="StartContainer for \"a26a56ff867c05c0ad5edcb5b7d6c09d1ce7ea8cabc05fe137870afc2e954af5\" returns successfully" Jul 9 15:01:42.430629 systemd-networkd[1444]: calib2618bc9c2a: Gained IPv6LL Jul 9 15:01:50.569156 containerd[1558]: time="2025-07-09T15:01:50.568908184Z" level=info msg="StopPodSandbox for \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\"" Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.642 [WARNING][6126] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.642 [INFO][6126] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.642 [INFO][6126] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" iface="eth0" netns="" Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.642 [INFO][6126] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.642 [INFO][6126] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.692 [INFO][6133] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.693 [INFO][6133] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.693 [INFO][6133] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.707 [WARNING][6133] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.707 [INFO][6133] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.710 [INFO][6133] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 9 15:01:50.714424 containerd[1558]: 2025-07-09 15:01:50.712 [INFO][6126] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:50.715285 containerd[1558]: time="2025-07-09T15:01:50.715134626Z" level=info msg="TearDown network for sandbox \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" successfully" Jul 9 15:01:50.715285 containerd[1558]: time="2025-07-09T15:01:50.715163501Z" level=info msg="StopPodSandbox for \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" returns successfully" Jul 9 15:01:50.716929 containerd[1558]: time="2025-07-09T15:01:50.716849936Z" level=info msg="RemovePodSandbox for \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\"" Jul 9 15:01:50.716929 containerd[1558]: time="2025-07-09T15:01:50.716899489Z" level=info msg="Forcibly stopping sandbox \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\"" Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.804 [WARNING][6147] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" WorkloadEndpoint="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.804 [INFO][6147] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.804 [INFO][6147] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" iface="eth0" netns="" Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.804 [INFO][6147] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.804 [INFO][6147] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.839 [INFO][6155] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.839 [INFO][6155] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.839 [INFO][6155] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.855 [WARNING][6155] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.855 [INFO][6155] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" HandleID="k8s-pod-network.9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Workload="ci--9999--9--100--bf645a1a30.novalocal-k8s-whisker--5b97bcbb6--ssbg8-eth0" Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.858 [INFO][6155] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 9 15:01:50.863635 containerd[1558]: 2025-07-09 15:01:50.861 [INFO][6147] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0" Jul 9 15:01:50.865314 containerd[1558]: time="2025-07-09T15:01:50.864596310Z" level=info msg="TearDown network for sandbox \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" successfully" Jul 9 15:01:50.869528 containerd[1558]: time="2025-07-09T15:01:50.869465938Z" level=info msg="Ensure that sandbox 9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0 in task-service has been cleanup successfully" Jul 9 15:01:51.243486 containerd[1558]: time="2025-07-09T15:01:51.241816724Z" level=info msg="RemovePodSandbox \"9092ceea0ebbfe1625348f8b18af1a82b8e9a6e01755727c31f35dacc46aaca0\" returns successfully" Jul 9 15:01:56.728097 containerd[1558]: time="2025-07-09T15:01:56.728001770Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"6021a7d450f51afa364aa9c4f7c52ec9ff01af2116c09743cc825831b588213d\" pid:6179 exited_at:{seconds:1752073316 nanos:726396978}" Jul 9 15:01:56.834186 containerd[1558]: time="2025-07-09T15:01:56.834119421Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"b3bd28fbd8be5722588ce453336d265881018416f974cb9e0f7ccb91de49f324\" pid:6202 exited_at:{seconds:1752073316 nanos:832699707}" Jul 9 15:01:57.933243 containerd[1558]: time="2025-07-09T15:01:57.933195168Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" id:\"4032189169e7b2e526f5685c18d8a4bfb9ac43e9074f156095d931c4fa6d52ef\" pid:6222 exited_at:{seconds:1752073317 nanos:932706098}" Jul 9 15:02:26.872018 containerd[1558]: time="2025-07-09T15:02:26.870489158Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"e68dfb388a4eb37ab1e6254364d6b89972b3677390fee3131a1b8f946840c4c6\" pid:6279 exited_at:{seconds:1752073346 nanos:867839761}" Jul 9 15:02:26.876020 containerd[1558]: time="2025-07-09T15:02:26.872613666Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"155d859add5eef0a4d5dda43479fa97ee98e1b3e27d80b970a336b55037af5e4\" pid:6281 exited_at:{seconds:1752073346 nanos:869660388}" Jul 9 15:02:26.881642 containerd[1558]: time="2025-07-09T15:02:26.881427998Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"92a1e2e3782e13796ab9396efc78d68440228763315bc6f95f721bb8c2ba2646\" pid:6320 exited_at:{seconds:1752073346 nanos:880029496}" Jul 9 15:02:26.895156 containerd[1558]: time="2025-07-09T15:02:26.895086898Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"0349a3eb2a647cf3ee1d5604d88e546c98035d0eeb6e3e73e7764918203fd8d0\" pid:6329 exited_at:{seconds:1752073346 nanos:894874358}" Jul 9 15:02:27.927347 containerd[1558]: time="2025-07-09T15:02:27.927093057Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" id:\"a9a54b0279ab77adb676e022459ae8aeea5355332ea07ee04de428d051533be1\" pid:6367 exited_at:{seconds:1752073347 nanos:926787151}" Jul 9 15:02:43.265909 containerd[1558]: time="2025-07-09T15:02:43.265306928Z" level=warning msg="container event discarded" container=3858bfd7437667df414c7d203eed8a6b9c5f9a22c012240855262971729faca2 type=CONTAINER_CREATED_EVENT Jul 9 15:02:43.265909 containerd[1558]: time="2025-07-09T15:02:43.265616581Z" level=warning msg="container event discarded" container=3858bfd7437667df414c7d203eed8a6b9c5f9a22c012240855262971729faca2 type=CONTAINER_STARTED_EVENT Jul 9 15:02:43.291005 containerd[1558]: time="2025-07-09T15:02:43.290833442Z" level=warning msg="container event discarded" container=e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead type=CONTAINER_CREATED_EVENT Jul 9 15:02:43.291005 containerd[1558]: time="2025-07-09T15:02:43.290890440Z" level=warning msg="container event discarded" container=e5bf610cfd0e8d3ffdda8685a3e13527bc853655a94ab75c4eda0f171daeeead type=CONTAINER_STARTED_EVENT Jul 9 15:02:43.319194 containerd[1558]: time="2025-07-09T15:02:43.319103309Z" level=warning msg="container event discarded" container=58b18b0d70b2222620d23f3f729e1ec78221364de451a263117e20f4618dfa58 type=CONTAINER_CREATED_EVENT Jul 9 15:02:43.319194 containerd[1558]: time="2025-07-09T15:02:43.319137323Z" level=warning msg="container event discarded" container=96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39 type=CONTAINER_CREATED_EVENT Jul 9 15:02:43.319194 containerd[1558]: time="2025-07-09T15:02:43.319146661Z" level=warning msg="container event discarded" container=96f82615bf7b561e5aabc4340bb3b165cf8fbdeea47ec7dbca1399057f3f2a39 type=CONTAINER_STARTED_EVENT Jul 9 15:02:43.345503 containerd[1558]: time="2025-07-09T15:02:43.345339279Z" level=warning msg="container event discarded" container=ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1 type=CONTAINER_CREATED_EVENT Jul 9 15:02:43.367550 containerd[1558]: time="2025-07-09T15:02:43.367501761Z" level=warning msg="container event discarded" container=66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d type=CONTAINER_CREATED_EVENT Jul 9 15:02:43.479059 containerd[1558]: time="2025-07-09T15:02:43.478918521Z" level=warning msg="container event discarded" container=58b18b0d70b2222620d23f3f729e1ec78221364de451a263117e20f4618dfa58 type=CONTAINER_STARTED_EVENT Jul 9 15:02:43.495260 containerd[1558]: time="2025-07-09T15:02:43.495147316Z" level=warning msg="container event discarded" container=ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1 type=CONTAINER_STARTED_EVENT Jul 9 15:02:43.525623 containerd[1558]: time="2025-07-09T15:02:43.525372263Z" level=warning msg="container event discarded" 
container=66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d type=CONTAINER_STARTED_EVENT Jul 9 15:02:56.085845 containerd[1558]: time="2025-07-09T15:02:56.085287696Z" level=warning msg="container event discarded" container=b213d4bc0faec35c24c194415341dc8696e14fc8d3f55a69f04bb42ff89a9565 type=CONTAINER_CREATED_EVENT Jul 9 15:02:56.085845 containerd[1558]: time="2025-07-09T15:02:56.085680456Z" level=warning msg="container event discarded" container=b213d4bc0faec35c24c194415341dc8696e14fc8d3f55a69f04bb42ff89a9565 type=CONTAINER_STARTED_EVENT Jul 9 15:02:56.128132 containerd[1558]: time="2025-07-09T15:02:56.128050997Z" level=warning msg="container event discarded" container=0545f9d16965ebb5999640aa303c501e8621b277463ad503f9601964f29b01d8 type=CONTAINER_CREATED_EVENT Jul 9 15:02:56.259845 containerd[1558]: time="2025-07-09T15:02:56.259555031Z" level=warning msg="container event discarded" container=0545f9d16965ebb5999640aa303c501e8621b277463ad503f9601964f29b01d8 type=CONTAINER_STARTED_EVENT Jul 9 15:02:56.295522 containerd[1558]: time="2025-07-09T15:02:56.295350438Z" level=warning msg="container event discarded" container=34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89 type=CONTAINER_CREATED_EVENT Jul 9 15:02:56.295522 containerd[1558]: time="2025-07-09T15:02:56.295414078Z" level=warning msg="container event discarded" container=34ab10e4db784f3c21273b32ac13cb598a0517d354e406a53d04ea6e58731f89 type=CONTAINER_STARTED_EVENT Jul 9 15:02:56.810528 containerd[1558]: time="2025-07-09T15:02:56.809200334Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"8144e605f3d0293821df30f676fd2e7e7394ae2c142cc97deedc53ba6c0b762f\" pid:6398 exited_at:{seconds:1752073376 nanos:807551661}" Jul 9 15:02:56.826000 containerd[1558]: time="2025-07-09T15:02:56.825897058Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"efabc93094fbe49eddc89581e981f88d56c28f6321ff9b5041a74cc85e8ab155\" pid:6420 exited_at:{seconds:1752073376 nanos:822306862}" Jul 9 15:02:57.982305 containerd[1558]: time="2025-07-09T15:02:57.982223671Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" id:\"9b5fdbb0e86388c9e7468142b76c6f859a7e22fdef5511a2396f327d396dcef2\" pid:6441 exited_at:{seconds:1752073377 nanos:981619504}" Jul 9 15:02:59.766705 containerd[1558]: time="2025-07-09T15:02:59.766321448Z" level=warning msg="container event discarded" container=e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e type=CONTAINER_CREATED_EVENT Jul 9 15:03:00.007389 containerd[1558]: time="2025-07-09T15:03:00.007151862Z" level=warning msg="container event discarded" container=e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e type=CONTAINER_STARTED_EVENT Jul 9 15:03:12.384670 update_engine[1534]: I20250709 15:03:12.383983 1534 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jul 9 15:03:12.384670 update_engine[1534]: I20250709 15:03:12.384395 1534 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jul 9 15:03:12.389237 update_engine[1534]: I20250709 15:03:12.386710 1534 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jul 9 15:03:12.390390 update_engine[1534]: I20250709 15:03:12.390312 1534 omaha_request_params.cc:62] Current group set to 
developer Jul 9 15:03:12.395569 update_engine[1534]: I20250709 15:03:12.395022 1534 update_attempter.cc:499] Already updated boot flags. Skipping. Jul 9 15:03:12.395569 update_engine[1534]: I20250709 15:03:12.395066 1534 update_attempter.cc:643] Scheduling an action processor start. Jul 9 15:03:12.395569 update_engine[1534]: I20250709 15:03:12.395123 1534 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jul 9 15:03:12.395569 update_engine[1534]: I20250709 15:03:12.395373 1534 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jul 9 15:03:12.397136 update_engine[1534]: I20250709 15:03:12.397073 1534 omaha_request_action.cc:271] Posting an Omaha request to disabled Jul 9 15:03:12.397136 update_engine[1534]: I20250709 15:03:12.397116 1534 omaha_request_action.cc:272] Request: Jul 9 15:03:12.397136 update_engine[1534]: Jul 9 15:03:12.397136 update_engine[1534]: Jul 9 15:03:12.397136 update_engine[1534]: Jul 9 15:03:12.397136 update_engine[1534]: Jul 9 15:03:12.397136 update_engine[1534]: Jul 9 15:03:12.397136 update_engine[1534]: Jul 9 15:03:12.397136 update_engine[1534]: Jul 9 15:03:12.397136 update_engine[1534]: Jul 9 15:03:12.398024 update_engine[1534]: I20250709 15:03:12.397153 1534 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 9 15:03:12.411507 update_engine[1534]: I20250709 15:03:12.408620 1534 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 9 15:03:12.411507 update_engine[1534]: I20250709 15:03:12.409181 1534 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jul 9 15:03:12.417764 locksmithd[1572]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jul 9 15:03:12.418264 update_engine[1534]: E20250709 15:03:12.417594 1534 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 9 15:03:12.418264 update_engine[1534]: I20250709 15:03:12.417699 1534 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jul 9 15:03:14.704851 containerd[1558]: time="2025-07-09T15:03:14.703748399Z" level=warning msg="container event discarded" container=6ae00df21446f64a3516fe17c241a09fa9e9c92c85165e0b2541c2202e4f225c type=CONTAINER_CREATED_EVENT Jul 9 15:03:14.704851 containerd[1558]: time="2025-07-09T15:03:14.704874888Z" level=warning msg="container event discarded" container=6ae00df21446f64a3516fe17c241a09fa9e9c92c85165e0b2541c2202e4f225c type=CONTAINER_STARTED_EVENT Jul 9 15:03:14.817777 containerd[1558]: time="2025-07-09T15:03:14.817730132Z" level=warning msg="container event discarded" container=09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac type=CONTAINER_CREATED_EVENT Jul 9 15:03:14.817994 containerd[1558]: time="2025-07-09T15:03:14.817974332Z" level=warning msg="container event discarded" container=09c013071b0a3fdf2950a77c66169188cb984010ba03e006d9dfb797001e3dac type=CONTAINER_STARTED_EVENT Jul 9 15:03:18.641571 systemd[1]: Started sshd@9-172.24.4.222:22-172.24.4.1:58028.service - OpenSSH per-connection server daemon (172.24.4.1:58028). Jul 9 15:03:20.076309 sshd[6460]: Accepted publickey for core from 172.24.4.1 port 58028 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 15:03:20.081104 sshd-session[6460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 15:03:20.100031 systemd-logind[1533]: New session 12 of user core. Jul 9 15:03:20.110839 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jul 9 15:03:21.041916 sshd[6463]: Connection closed by 172.24.4.1 port 58028 Jul 9 15:03:21.041634 sshd-session[6460]: pam_unix(sshd:session): session closed for user core Jul 9 15:03:21.050845 systemd[1]: sshd@9-172.24.4.222:22-172.24.4.1:58028.service: Deactivated successfully. Jul 9 15:03:21.058903 systemd[1]: session-12.scope: Deactivated successfully. Jul 9 15:03:21.065268 systemd-logind[1533]: Session 12 logged out. Waiting for processes to exit. Jul 9 15:03:21.070590 systemd-logind[1533]: Removed session 12. Jul 9 15:03:22.309507 update_engine[1534]: I20250709 15:03:22.308564 1534 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 9 15:03:22.309507 update_engine[1534]: I20250709 15:03:22.308964 1534 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 9 15:03:22.309507 update_engine[1534]: I20250709 15:03:22.309288 1534 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jul 9 15:03:22.314893 update_engine[1534]: E20250709 15:03:22.314703 1534 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 9 15:03:22.314893 update_engine[1534]: I20250709 15:03:22.314846 1534 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jul 9 15:03:26.059392 systemd[1]: Started sshd@10-172.24.4.222:22-172.24.4.1:56470.service - OpenSSH per-connection server daemon (172.24.4.1:56470). Jul 9 15:03:27.038326 containerd[1558]: time="2025-07-09T15:03:27.038127555Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"44ca556557cd7117eff0259f00798a4c3649c3ed34e06ca73f89e76929024c91\" pid:6552 exited_at:{seconds:1752073407 nanos:37492960}" Jul 9 15:03:27.071963 containerd[1558]: time="2025-07-09T15:03:27.071904838Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"23a765a2b2a803ad45c5aaa8aefa3011b0184407dd068548ae5e2d9d864360f8\" pid:6549 exited_at:{seconds:1752073407 nanos:70917570}" Jul 9 15:03:27.140065 containerd[1558]: time="2025-07-09T15:03:27.139934349Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"94166bf057528a28e6262fbcb88de49b3762f01312c637bb57d1302dadad40a1\" pid:6504 exited_at:{seconds:1752073407 nanos:138607852}" Jul 9 15:03:27.140663 containerd[1558]: time="2025-07-09T15:03:27.140621742Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"252b65e3e9ec4a470d257370bb69b189d0ddc080cfd7146506ae517e72719ad8\" pid:6506 exited_at:{seconds:1752073407 nanos:139777113}" Jul 9 15:03:27.358898 sshd[6476]: Accepted publickey for core from 172.24.4.1 port 56470 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 15:03:27.361577 sshd-session[6476]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 15:03:27.375286 systemd-logind[1533]: New session 13 of user core. Jul 9 15:03:27.378785 systemd[1]: Started session-13.scope - Session 13 of User core. Jul 9 15:03:28.135177 sshd[6572]: Connection closed by 172.24.4.1 port 56470 Jul 9 15:03:28.135568 sshd-session[6476]: pam_unix(sshd:session): session closed for user core Jul 9 15:03:28.143001 systemd[1]: sshd@10-172.24.4.222:22-172.24.4.1:56470.service: Deactivated successfully. Jul 9 15:03:28.143537 systemd-logind[1533]: Session 13 logged out. Waiting for processes to exit. 
Jul 9 15:03:28.147550 systemd[1]: session-13.scope: Deactivated successfully. Jul 9 15:03:28.153668 systemd-logind[1533]: Removed session 13. Jul 9 15:03:28.171891 containerd[1558]: time="2025-07-09T15:03:28.171696475Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" id:\"05b93a6cc8425f27e1bd71efd93e18d0e6bbe5bfe3561d050ec9a7cdb0a58b1e\" pid:6593 exited_at:{seconds:1752073408 nanos:170927929}" Jul 9 15:03:32.310133 update_engine[1534]: I20250709 15:03:32.309737 1534 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 9 15:03:32.311376 update_engine[1534]: I20250709 15:03:32.311196 1534 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 9 15:03:32.312780 update_engine[1534]: I20250709 15:03:32.312705 1534 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jul 9 15:03:32.318048 update_engine[1534]: E20250709 15:03:32.317950 1534 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 9 15:03:32.318236 update_engine[1534]: I20250709 15:03:32.318120 1534 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jul 9 15:03:33.176174 systemd[1]: Started sshd@11-172.24.4.222:22-172.24.4.1:56480.service - OpenSSH per-connection server daemon (172.24.4.1:56480). Jul 9 15:03:34.358886 sshd[6614]: Accepted publickey for core from 172.24.4.1 port 56480 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 15:03:34.363160 sshd-session[6614]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 15:03:34.380956 systemd-logind[1533]: New session 14 of user core. Jul 9 15:03:34.390838 systemd[1]: Started session-14.scope - Session 14 of User core. Jul 9 15:03:35.208181 sshd[6617]: Connection closed by 172.24.4.1 port 56480 Jul 9 15:03:35.205868 sshd-session[6614]: pam_unix(sshd:session): session closed for user core Jul 9 15:03:35.229110 systemd[1]: sshd@11-172.24.4.222:22-172.24.4.1:56480.service: Deactivated successfully. Jul 9 15:03:35.236967 systemd[1]: session-14.scope: Deactivated successfully. Jul 9 15:03:35.240237 systemd-logind[1533]: Session 14 logged out. Waiting for processes to exit. Jul 9 15:03:35.251279 systemd[1]: Started sshd@12-172.24.4.222:22-172.24.4.1:50130.service - OpenSSH per-connection server daemon (172.24.4.1:50130). Jul 9 15:03:35.254186 systemd-logind[1533]: Removed session 14. Jul 9 15:03:36.540608 sshd[6631]: Accepted publickey for core from 172.24.4.1 port 50130 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 15:03:36.545222 sshd-session[6631]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 15:03:36.555987 systemd-logind[1533]: New session 15 of user core. Jul 9 15:03:36.561656 systemd[1]: Started session-15.scope - Session 15 of User core. Jul 9 15:03:37.567153 sshd[6634]: Connection closed by 172.24.4.1 port 50130 Jul 9 15:03:37.567006 sshd-session[6631]: pam_unix(sshd:session): session closed for user core Jul 9 15:03:37.584252 systemd[1]: sshd@12-172.24.4.222:22-172.24.4.1:50130.service: Deactivated successfully. Jul 9 15:03:37.589953 systemd[1]: session-15.scope: Deactivated successfully. Jul 9 15:03:37.591986 systemd-logind[1533]: Session 15 logged out. Waiting for processes to exit. Jul 9 15:03:37.599328 systemd[1]: Started sshd@13-172.24.4.222:22-172.24.4.1:50132.service - OpenSSH per-connection server daemon (172.24.4.1:50132). Jul 9 15:03:37.601021 systemd-logind[1533]: Removed session 15. 
Jul 9 15:03:38.923392 sshd[6643]: Accepted publickey for core from 172.24.4.1 port 50132 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 15:03:38.927729 sshd-session[6643]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 15:03:38.939617 systemd-logind[1533]: New session 16 of user core. Jul 9 15:03:38.952796 systemd[1]: Started session-16.scope - Session 16 of User core. Jul 9 15:03:39.876605 sshd[6646]: Connection closed by 172.24.4.1 port 50132 Jul 9 15:03:39.878301 sshd-session[6643]: pam_unix(sshd:session): session closed for user core Jul 9 15:03:39.891823 systemd[1]: sshd@13-172.24.4.222:22-172.24.4.1:50132.service: Deactivated successfully. Jul 9 15:03:39.899220 systemd[1]: session-16.scope: Deactivated successfully. Jul 9 15:03:39.904202 systemd-logind[1533]: Session 16 logged out. Waiting for processes to exit. Jul 9 15:03:39.908305 systemd-logind[1533]: Removed session 16. Jul 9 15:03:42.310181 update_engine[1534]: I20250709 15:03:42.310003 1534 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 9 15:03:42.311197 update_engine[1534]: I20250709 15:03:42.310827 1534 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 9 15:03:42.311650 update_engine[1534]: I20250709 15:03:42.311566 1534 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jul 9 15:03:42.319665 update_engine[1534]: E20250709 15:03:42.319581 1534 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 9 15:03:42.319930 update_engine[1534]: I20250709 15:03:42.319682 1534 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jul 9 15:03:42.319930 update_engine[1534]: I20250709 15:03:42.319725 1534 omaha_request_action.cc:617] Omaha request response: Jul 9 15:03:42.320094 update_engine[1534]: E20250709 15:03:42.319974 1534 omaha_request_action.cc:636] Omaha request network transfer failed. Jul 9 15:03:42.321503 update_engine[1534]: I20250709 15:03:42.320562 1534 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Jul 9 15:03:42.321503 update_engine[1534]: I20250709 15:03:42.320598 1534 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jul 9 15:03:42.321503 update_engine[1534]: I20250709 15:03:42.320621 1534 update_attempter.cc:306] Processing Done. Jul 9 15:03:42.321503 update_engine[1534]: E20250709 15:03:42.320735 1534 update_attempter.cc:619] Update failed. Jul 9 15:03:42.321503 update_engine[1534]: I20250709 15:03:42.320765 1534 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jul 9 15:03:42.321503 update_engine[1534]: I20250709 15:03:42.320776 1534 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jul 9 15:03:42.321503 update_engine[1534]: I20250709 15:03:42.320789 1534 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Jul 9 15:03:42.322229 update_engine[1534]: I20250709 15:03:42.321638 1534 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jul 9 15:03:42.322229 update_engine[1534]: I20250709 15:03:42.321808 1534 omaha_request_action.cc:271] Posting an Omaha request to disabled Jul 9 15:03:42.322229 update_engine[1534]: I20250709 15:03:42.321825 1534 omaha_request_action.cc:272] Request: Jul 9 15:03:42.322229 update_engine[1534]: Jul 9 15:03:42.322229 update_engine[1534]: Jul 9 15:03:42.322229 update_engine[1534]: Jul 9 15:03:42.322229 update_engine[1534]: Jul 9 15:03:42.322229 update_engine[1534]: Jul 9 15:03:42.322229 update_engine[1534]: Jul 9 15:03:42.322229 update_engine[1534]: I20250709 15:03:42.321837 1534 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 9 15:03:42.322229 update_engine[1534]: I20250709 15:03:42.322166 1534 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 9 15:03:42.323194 update_engine[1534]: I20250709 15:03:42.322741 1534 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jul 9 15:03:42.326113 locksmithd[1572]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jul 9 15:03:42.328114 update_engine[1534]: E20250709 15:03:42.328032 1534 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 9 15:03:42.328312 update_engine[1534]: I20250709 15:03:42.328135 1534 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jul 9 15:03:42.328312 update_engine[1534]: I20250709 15:03:42.328152 1534 omaha_request_action.cc:617] Omaha request response: Jul 9 15:03:42.328312 update_engine[1534]: I20250709 15:03:42.328166 1534 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jul 9 15:03:42.328312 update_engine[1534]: I20250709 15:03:42.328176 1534 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jul 9 15:03:42.328312 update_engine[1534]: I20250709 15:03:42.328186 1534 update_attempter.cc:306] Processing Done. Jul 9 15:03:42.328312 update_engine[1534]: I20250709 15:03:42.328198 1534 update_attempter.cc:310] Error event sent. Jul 9 15:03:42.328312 update_engine[1534]: I20250709 15:03:42.328261 1534 update_check_scheduler.cc:74] Next update check in 43m55s Jul 9 15:03:42.329856 locksmithd[1572]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jul 9 15:03:44.899622 systemd[1]: Started sshd@14-172.24.4.222:22-172.24.4.1:60170.service - OpenSSH per-connection server daemon (172.24.4.1:60170). Jul 9 15:03:46.185544 sshd[6658]: Accepted publickey for core from 172.24.4.1 port 60170 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 15:03:46.187753 sshd-session[6658]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 15:03:46.199177 systemd-logind[1533]: New session 17 of user core. Jul 9 15:03:46.209744 systemd[1]: Started session-17.scope - Session 17 of User core. Jul 9 15:03:47.115604 sshd[6661]: Connection closed by 172.24.4.1 port 60170 Jul 9 15:03:47.117652 sshd-session[6658]: pam_unix(sshd:session): session closed for user core Jul 9 15:03:47.121582 systemd-logind[1533]: Session 17 logged out. Waiting for processes to exit. Jul 9 15:03:47.122667 systemd[1]: sshd@14-172.24.4.222:22-172.24.4.1:60170.service: Deactivated successfully. 
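The update_engine exchange that concludes above fails every transfer with "Could not resolve host: disabled", presumably because the Omaha server on this image is set to the literal string "disabled" to switch updates off; after three retries the attempt is abandoned, an error event is reported, and the next check is scheduled 43m55s later. A generic fetch-with-retries loop mirroring that shape (none of this is update_engine's actual code):

// omaha_retry.go — generic fetch-with-retries loop shaped like the
// update_engine attempts above; illustrative only.
package main

import (
	"errors"
	"fmt"
	"time"
)

func fetchOnce(server string) error {
	// A host that is literally "disabled" can never resolve, so every
	// attempt fails the same way the journal shows.
	return fmt.Errorf("could not resolve host: %s", server)
}

func checkForUpdate(server string, retries int, wait time.Duration) error {
	var lastErr error
	for attempt := 1; attempt <= retries; attempt++ {
		if lastErr = fetchOnce(server); lastErr == nil {
			return nil
		}
		fmt.Printf("No HTTP response, retry %d\n", attempt)
		time.Sleep(wait)
	}
	return errors.Join(errors.New("Omaha request network transfer failed"), lastErr)
}

func main() {
	if err := checkForUpdate("disabled", 3, 10*time.Millisecond); err != nil {
		fmt.Println(err)
		fmt.Println("Next update check in 43m55s") // as scheduled in the log
	}
}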
Jul 9 15:03:47.129580 systemd[1]: session-17.scope: Deactivated successfully. Jul 9 15:03:47.136556 systemd-logind[1533]: Removed session 17. Jul 9 15:03:50.414877 containerd[1558]: time="2025-07-09T15:03:50.414570106Z" level=warning msg="container event discarded" container=66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d type=CONTAINER_STOPPED_EVENT Jul 9 15:03:50.502068 containerd[1558]: time="2025-07-09T15:03:50.501937760Z" level=warning msg="container event discarded" container=e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e type=CONTAINER_STOPPED_EVENT Jul 9 15:03:50.865831 containerd[1558]: time="2025-07-09T15:03:50.865748255Z" level=warning msg="container event discarded" container=90559f9fff6bfda99bd405d43fadce1c8e56759920c87d2f5e71e110dd540d57 type=CONTAINER_CREATED_EVENT Jul 9 15:03:51.128876 containerd[1558]: time="2025-07-09T15:03:51.128396626Z" level=warning msg="container event discarded" container=4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a type=CONTAINER_CREATED_EVENT Jul 9 15:03:51.128876 containerd[1558]: time="2025-07-09T15:03:51.128694787Z" level=warning msg="container event discarded" container=90559f9fff6bfda99bd405d43fadce1c8e56759920c87d2f5e71e110dd540d57 type=CONTAINER_STARTED_EVENT Jul 9 15:03:51.209761 containerd[1558]: time="2025-07-09T15:03:51.209640869Z" level=warning msg="container event discarded" container=d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a type=CONTAINER_CREATED_EVENT Jul 9 15:03:51.377349 containerd[1558]: time="2025-07-09T15:03:51.377006400Z" level=warning msg="container event discarded" container=4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a type=CONTAINER_STARTED_EVENT Jul 9 15:03:51.419405 containerd[1558]: time="2025-07-09T15:03:51.419060394Z" level=warning msg="container event discarded" container=d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a type=CONTAINER_STARTED_EVENT Jul 9 15:03:52.138281 systemd[1]: Started sshd@15-172.24.4.222:22-172.24.4.1:60178.service - OpenSSH per-connection server daemon (172.24.4.1:60178). Jul 9 15:03:52.489714 containerd[1558]: time="2025-07-09T15:03:52.489318831Z" level=warning msg="container event discarded" container=ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1 type=CONTAINER_STOPPED_EVENT Jul 9 15:03:53.011188 containerd[1558]: time="2025-07-09T15:03:53.010985470Z" level=warning msg="container event discarded" container=59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b type=CONTAINER_CREATED_EVENT Jul 9 15:03:53.351616 containerd[1558]: time="2025-07-09T15:03:53.349695238Z" level=warning msg="container event discarded" container=59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b type=CONTAINER_STARTED_EVENT Jul 9 15:03:53.418223 containerd[1558]: time="2025-07-09T15:03:53.418053903Z" level=warning msg="container event discarded" container=af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2 type=CONTAINER_CREATED_EVENT Jul 9 15:03:53.578983 sshd[6679]: Accepted publickey for core from 172.24.4.1 port 60178 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E Jul 9 15:03:53.582150 sshd-session[6679]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 9 15:03:53.599103 systemd-logind[1533]: New session 18 of user core. Jul 9 15:03:53.608937 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jul 9 15:03:53.979806 containerd[1558]: time="2025-07-09T15:03:53.978084812Z" level=warning msg="container event discarded" container=af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2 type=CONTAINER_STARTED_EVENT
Jul 9 15:03:54.227022 containerd[1558]: time="2025-07-09T15:03:54.226920388Z" level=warning msg="container event discarded" container=af274f07bbde39c6b492d8c667ccc09717489734613da9e573938dca7def9ef2 type=CONTAINER_STOPPED_EVENT
Jul 9 15:03:54.266647 sshd[6682]: Connection closed by 172.24.4.1 port 60178
Jul 9 15:03:54.267772 sshd-session[6679]: pam_unix(sshd:session): session closed for user core
Jul 9 15:03:54.279123 systemd[1]: sshd@15-172.24.4.222:22-172.24.4.1:60178.service: Deactivated successfully.
Jul 9 15:03:54.289131 systemd[1]: session-18.scope: Deactivated successfully.
Jul 9 15:03:54.299972 systemd-logind[1533]: Session 18 logged out. Waiting for processes to exit.
Jul 9 15:03:54.304180 systemd-logind[1533]: Removed session 18.
Jul 9 15:03:56.711125 containerd[1558]: time="2025-07-09T15:03:56.711057076Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"5e832059489da615bc173dd77b744cc310c43b05cb5260f4ef88e5f5d4bf81cf\" pid:6706 exited_at:{seconds:1752073436 nanos:710506520}"
Jul 9 15:03:56.797210 containerd[1558]: time="2025-07-09T15:03:56.796234598Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"e6c9a182f44c4efff59e22dea066172cf3e0268de355ac5e1d2d79609c0a92f1\" pid:6728 exited_at:{seconds:1752073436 nanos:795933762}"
Jul 9 15:03:57.978632 containerd[1558]: time="2025-07-09T15:03:57.976556830Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" id:\"df331f5775e46463caa82f81ecaeafd51edeb95f49efed81f9a382fbcb95d3c9\" pid:6749 exited_at:{seconds:1752073437 nanos:974246152}"
Jul 9 15:03:59.289637 systemd[1]: Started sshd@16-172.24.4.222:22-172.24.4.1:57380.service - OpenSSH per-connection server daemon (172.24.4.1:57380).
Jul 9 15:04:00.586127 sshd[6768]: Accepted publickey for core from 172.24.4.1 port 57380 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E
Jul 9 15:04:00.590276 sshd-session[6768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 9 15:04:00.602604 systemd-logind[1533]: New session 19 of user core.
Jul 9 15:04:00.611772 systemd[1]: Started session-19.scope - Session 19 of User core.
Jul 9 15:04:01.567921 sshd[6771]: Connection closed by 172.24.4.1 port 57380
Jul 9 15:04:01.569376 sshd-session[6768]: pam_unix(sshd:session): session closed for user core
Jul 9 15:04:01.579860 systemd[1]: sshd@16-172.24.4.222:22-172.24.4.1:57380.service: Deactivated successfully.
Jul 9 15:04:01.586929 systemd[1]: session-19.scope: Deactivated successfully.
Jul 9 15:04:01.591221 systemd-logind[1533]: Session 19 logged out. Waiting for processes to exit.
Jul 9 15:04:01.595671 systemd-logind[1533]: Removed session 19.
Jul 9 15:04:06.595109 systemd[1]: Started sshd@17-172.24.4.222:22-172.24.4.1:35078.service - OpenSSH per-connection server daemon (172.24.4.1:35078).
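
The containerd TaskExit entries report the exit instant as an exited_at pair of Unix epoch seconds and a nanosecond remainder. A short sketch (the helper name is illustrative) converting the first value above, 1752073436, back to wall-clock UTC; it resolves to the same 15:03:56 instant carried by the journal prefix:

    from datetime import datetime, timezone

    def exited_at_to_iso(seconds, nanos=0):
        """Render a containerd exited_at {seconds, nanos} pair as ISO 8601 UTC."""
        ts = datetime.fromtimestamp(seconds, tz=timezone.utc)
        return f"{ts.isoformat(timespec='seconds')} +{nanos}ns"

    print(exited_at_to_iso(1752073436, 710506520))
    # 2025-07-09T15:03:56+00:00 +710506520ns
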
Jul 9 15:04:07.992385 sshd[6795]: Accepted publickey for core from 172.24.4.1 port 35078 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E
Jul 9 15:04:08.000105 sshd-session[6795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 9 15:04:08.017229 systemd-logind[1533]: New session 20 of user core.
Jul 9 15:04:08.032306 systemd[1]: Started session-20.scope - Session 20 of User core.
Jul 9 15:04:08.785505 sshd[6798]: Connection closed by 172.24.4.1 port 35078
Jul 9 15:04:08.786221 sshd-session[6795]: pam_unix(sshd:session): session closed for user core
Jul 9 15:04:08.805171 systemd[1]: sshd@17-172.24.4.222:22-172.24.4.1:35078.service: Deactivated successfully.
Jul 9 15:04:08.809817 systemd[1]: session-20.scope: Deactivated successfully.
Jul 9 15:04:08.814525 systemd-logind[1533]: Session 20 logged out. Waiting for processes to exit.
Jul 9 15:04:08.817339 systemd-logind[1533]: Removed session 20.
Jul 9 15:04:08.821719 systemd[1]: Started sshd@18-172.24.4.222:22-172.24.4.1:35094.service - OpenSSH per-connection server daemon (172.24.4.1:35094).
Jul 9 15:04:10.176793 sshd[6809]: Accepted publickey for core from 172.24.4.1 port 35094 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E
Jul 9 15:04:10.182903 sshd-session[6809]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 9 15:04:10.202044 systemd-logind[1533]: New session 21 of user core.
Jul 9 15:04:10.215888 systemd[1]: Started session-21.scope - Session 21 of User core.
Jul 9 15:04:11.571732 sshd[6812]: Connection closed by 172.24.4.1 port 35094
Jul 9 15:04:11.573566 sshd-session[6809]: pam_unix(sshd:session): session closed for user core
Jul 9 15:04:11.593440 systemd[1]: sshd@18-172.24.4.222:22-172.24.4.1:35094.service: Deactivated successfully.
Jul 9 15:04:11.600347 systemd[1]: session-21.scope: Deactivated successfully.
Jul 9 15:04:11.603678 systemd-logind[1533]: Session 21 logged out. Waiting for processes to exit.
Jul 9 15:04:11.614303 systemd[1]: Started sshd@19-172.24.4.222:22-172.24.4.1:35110.service - OpenSSH per-connection server daemon (172.24.4.1:35110).
Jul 9 15:04:11.617419 systemd-logind[1533]: Removed session 21.
Jul 9 15:04:12.767562 sshd[6823]: Accepted publickey for core from 172.24.4.1 port 35110 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E
Jul 9 15:04:12.772222 sshd-session[6823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 9 15:04:12.788429 systemd-logind[1533]: New session 22 of user core.
Jul 9 15:04:12.795885 systemd[1]: Started session-22.scope - Session 22 of User core.
Jul 9 15:04:16.816543 sshd[6826]: Connection closed by 172.24.4.1 port 35110
Jul 9 15:04:16.822564 sshd-session[6823]: pam_unix(sshd:session): session closed for user core
Jul 9 15:04:16.838072 systemd[1]: sshd@19-172.24.4.222:22-172.24.4.1:35110.service: Deactivated successfully.
Jul 9 15:04:16.847158 systemd[1]: session-22.scope: Deactivated successfully.
Jul 9 15:04:16.847638 systemd[1]: session-22.scope: Consumed 1.132s CPU time, 72.8M memory peak.
Jul 9 15:04:16.849865 systemd-logind[1533]: Session 22 logged out. Waiting for processes to exit.
Jul 9 15:04:16.862290 systemd[1]: Started sshd@20-172.24.4.222:22-172.24.4.1:53214.service - OpenSSH per-connection server daemon (172.24.4.1:53214).
Jul 9 15:04:16.878147 systemd-logind[1533]: Removed session 22.
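
Each SSH connection in this journal follows the same lifecycle: sshd accepts the public key, pam_unix opens the session, systemd-logind allocates session N and systemd starts session-N.scope; on disconnect the scope is deactivated (with CPU and memory accounting when the scope did measurable work, as for session 22 above) and logind removes the session. A rough sketch, assuming only the line format shown here (the regexes and names are illustrative, not part of systemd), that pairs the logind open/close entries and reports per-session durations:

    import re
    from datetime import datetime

    OPEN_RE = re.compile(r"^(\w+ +\d+ [\d:.]+) .*systemd-logind\[\d+\]: New session (\d+) of user")
    CLOSE_RE = re.compile(r"^(\w+ +\d+ [\d:.]+) .*systemd-logind\[\d+\]: Removed session (\d+)\.")

    def session_durations(journal_lines, year=2025):
        """Map session number -> seconds between 'New session' and 'Removed session'."""
        def parse(ts):  # e.g. "Jul 9 15:04:16.878147"
            return datetime.strptime(f"{year} {ts}", "%Y %b %d %H:%M:%S.%f")
        opened, durations = {}, {}
        for line in journal_lines:
            if m := OPEN_RE.match(line):
                opened[m.group(2)] = parse(m.group(1))
            elif (m := CLOSE_RE.match(line)) and m.group(2) in opened:
                durations[m.group(2)] = (parse(m.group(1)) - opened.pop(m.group(2))).total_seconds()
        return durations

    # Session 22 above, for instance, runs from 15:04:12.788429 to 15:04:16.878147,
    # roughly 4.1 seconds.
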
Jul 9 15:04:17.548176 containerd[1558]: time="2025-07-09T15:04:17.547948048Z" level=warning msg="container event discarded" container=ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15 type=CONTAINER_CREATED_EVENT
Jul 9 15:04:17.807750 containerd[1558]: time="2025-07-09T15:04:17.807591948Z" level=warning msg="container event discarded" container=ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15 type=CONTAINER_STARTED_EVENT
Jul 9 15:04:18.154573 sshd[6847]: Accepted publickey for core from 172.24.4.1 port 53214 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E
Jul 9 15:04:18.160697 sshd-session[6847]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 9 15:04:18.170280 systemd-logind[1533]: New session 23 of user core.
Jul 9 15:04:18.175778 systemd[1]: Started session-23.scope - Session 23 of User core.
Jul 9 15:04:19.131404 sshd[6850]: Connection closed by 172.24.4.1 port 53214
Jul 9 15:04:19.132195 sshd-session[6847]: pam_unix(sshd:session): session closed for user core
Jul 9 15:04:19.147257 systemd[1]: sshd@20-172.24.4.222:22-172.24.4.1:53214.service: Deactivated successfully.
Jul 9 15:04:19.150070 systemd[1]: session-23.scope: Deactivated successfully.
Jul 9 15:04:19.152852 systemd-logind[1533]: Session 23 logged out. Waiting for processes to exit.
Jul 9 15:04:19.158864 systemd[1]: Started sshd@21-172.24.4.222:22-172.24.4.1:53224.service - OpenSSH per-connection server daemon (172.24.4.1:53224).
Jul 9 15:04:19.160685 systemd-logind[1533]: Removed session 23.
Jul 9 15:04:20.333316 sshd[6860]: Accepted publickey for core from 172.24.4.1 port 53224 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E
Jul 9 15:04:20.335665 sshd-session[6860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 9 15:04:20.346371 systemd-logind[1533]: New session 24 of user core.
Jul 9 15:04:20.351673 systemd[1]: Started session-24.scope - Session 24 of User core.
Jul 9 15:04:21.097278 sshd[6863]: Connection closed by 172.24.4.1 port 53224
Jul 9 15:04:21.096617 sshd-session[6860]: pam_unix(sshd:session): session closed for user core
Jul 9 15:04:21.102186 systemd-logind[1533]: Session 24 logged out. Waiting for processes to exit.
Jul 9 15:04:21.103125 systemd[1]: sshd@21-172.24.4.222:22-172.24.4.1:53224.service: Deactivated successfully.
Jul 9 15:04:21.108428 systemd[1]: session-24.scope: Deactivated successfully.
Jul 9 15:04:21.113559 systemd-logind[1533]: Removed session 24.
Jul 9 15:04:26.123742 systemd[1]: Started sshd@22-172.24.4.222:22-172.24.4.1:53352.service - OpenSSH per-connection server daemon (172.24.4.1:53352).
Jul 9 15:04:26.913925 containerd[1558]: time="2025-07-09T15:04:26.913874628Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"b8b24571b70ac5a590fa6a231d2b471f4f22e688e114e1c2f340c3a959d609da\" pid:6951 exited_at:{seconds:1752073466 nanos:913227881}"
Jul 9 15:04:26.926031 containerd[1558]: time="2025-07-09T15:04:26.925968395Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"ec4a67efe2ae06e251de432890a0d87c5f72cf161ed0d94cfb5b98d02e191f5c\" pid:6935 exited_at:{seconds:1752073466 nanos:925585324}"
Jul 9 15:04:27.040726 containerd[1558]: time="2025-07-09T15:04:27.040651401Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"cb32fb8a23484dfc3d43bec103b98af8012651b23024adc864bf042abd68489b\" pid:6911 exited_at:{seconds:1752073467 nanos:39872095}"
Jul 9 15:04:27.058962 containerd[1558]: time="2025-07-09T15:04:27.058898937Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"b7b572c37c43897bf8b421221a0d878a75c3025fc792363f76c22d7220f8309f\" pid:6909 exited_at:{seconds:1752073467 nanos:58337360}"
Jul 9 15:04:27.293010 sshd[6880]: Accepted publickey for core from 172.24.4.1 port 53352 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E
Jul 9 15:04:27.294416 sshd-session[6880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 9 15:04:27.301810 systemd-logind[1533]: New session 25 of user core.
Jul 9 15:04:27.309686 systemd[1]: Started session-25.scope - Session 25 of User core.
Jul 9 15:04:28.034542 containerd[1558]: time="2025-07-09T15:04:28.034420492Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" id:\"8b3f8ad8cb4abe8844412721caffa8a03b1e4cae432154d33d93af78927525c8\" pid:6992 exited_at:{seconds:1752073468 nanos:33974855}"
Jul 9 15:04:28.040955 sshd[6971]: Connection closed by 172.24.4.1 port 53352
Jul 9 15:04:28.041558 sshd-session[6880]: pam_unix(sshd:session): session closed for user core
Jul 9 15:04:28.046863 systemd[1]: sshd@22-172.24.4.222:22-172.24.4.1:53352.service: Deactivated successfully.
Jul 9 15:04:28.050488 systemd[1]: session-25.scope: Deactivated successfully.
Jul 9 15:04:28.052109 systemd-logind[1533]: Session 25 logged out. Waiting for processes to exit.
Jul 9 15:04:28.055326 systemd-logind[1533]: Removed session 25.
Jul 9 15:04:33.079841 systemd[1]: Started sshd@23-172.24.4.222:22-172.24.4.1:53360.service - OpenSSH per-connection server daemon (172.24.4.1:53360).
Jul 9 15:04:34.366144 sshd[7008]: Accepted publickey for core from 172.24.4.1 port 53360 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E
Jul 9 15:04:34.370027 sshd-session[7008]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 9 15:04:34.387140 systemd-logind[1533]: New session 26 of user core.
Jul 9 15:04:34.398963 systemd[1]: Started session-26.scope - Session 26 of User core.
Jul 9 15:04:35.148339 sshd[7011]: Connection closed by 172.24.4.1 port 53360
Jul 9 15:04:35.149876 sshd-session[7008]: pam_unix(sshd:session): session closed for user core
Jul 9 15:04:35.160486 systemd[1]: sshd@23-172.24.4.222:22-172.24.4.1:53360.service: Deactivated successfully.
Jul 9 15:04:35.165871 systemd[1]: session-26.scope: Deactivated successfully.
Jul 9 15:04:35.168101 systemd-logind[1533]: Session 26 logged out. Waiting for processes to exit.
Jul 9 15:04:35.171167 systemd-logind[1533]: Removed session 26.
Jul 9 15:04:40.186284 systemd[1]: Started sshd@24-172.24.4.222:22-172.24.4.1:60402.service - OpenSSH per-connection server daemon (172.24.4.1:60402).
Jul 9 15:04:41.308835 sshd[7023]: Accepted publickey for core from 172.24.4.1 port 60402 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E
Jul 9 15:04:41.314609 sshd-session[7023]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 9 15:04:41.329631 systemd-logind[1533]: New session 27 of user core.
Jul 9 15:04:41.336886 systemd[1]: Started session-27.scope - Session 27 of User core.
Jul 9 15:04:42.052264 sshd[7026]: Connection closed by 172.24.4.1 port 60402
Jul 9 15:04:42.054123 sshd-session[7023]: pam_unix(sshd:session): session closed for user core
Jul 9 15:04:42.070110 systemd[1]: sshd@24-172.24.4.222:22-172.24.4.1:60402.service: Deactivated successfully.
Jul 9 15:04:42.075951 systemd[1]: session-27.scope: Deactivated successfully.
Jul 9 15:04:42.079112 systemd-logind[1533]: Session 27 logged out. Waiting for processes to exit.
Jul 9 15:04:42.084323 systemd-logind[1533]: Removed session 27.
Jul 9 15:04:45.956437 containerd[1558]: time="2025-07-09T15:04:45.956037082Z" level=warning msg="container event discarded" container=59f8178774a7bbd93a11b66c72a45273a415100c1de7991719d19f623c0a6f3b type=CONTAINER_STOPPED_EVENT
Jul 9 15:04:46.372708 containerd[1558]: time="2025-07-09T15:04:46.372595536Z" level=warning msg="container event discarded" container=4317f8a56a89dc346328904c860065f3e10dc519cbe11fc851ff11951f42246a type=CONTAINER_STOPPED_EVENT
Jul 9 15:04:46.672587 containerd[1558]: time="2025-07-09T15:04:46.672206572Z" level=warning msg="container event discarded" container=d18507d651687e264468c26c05a2b79703e1108972668f8ac5c3f42a172e9d6a type=CONTAINER_STOPPED_EVENT
Jul 9 15:04:47.076046 systemd[1]: Started sshd@25-172.24.4.222:22-172.24.4.1:51572.service - OpenSSH per-connection server daemon (172.24.4.1:51572).
Jul 9 15:04:47.105863 containerd[1558]: time="2025-07-09T15:04:47.105715748Z" level=warning msg="container event discarded" container=66bcfa175810d14547030327789fc720bdb70668a27956317df4e61c90abd86d type=CONTAINER_DELETED_EVENT
Jul 9 15:04:47.650952 containerd[1558]: time="2025-07-09T15:04:47.650432370Z" level=warning msg="container event discarded" container=ed5c418aa8618cec9b4332d2ab71a9b8621027e60be06a7cafc13619e5a4dcb1 type=CONTAINER_DELETED_EVENT
Jul 9 15:04:47.876200 containerd[1558]: time="2025-07-09T15:04:47.876004513Z" level=warning msg="container event discarded" container=ef3aac7a5af56329a49d0edb152a6226d69674166b4ed84bc61c5c082fa93c15 type=CONTAINER_STOPPED_EVENT
Jul 9 15:04:47.990951 containerd[1558]: time="2025-07-09T15:04:47.990554751Z" level=warning msg="container event discarded" container=57c9b947afe276b5bdb772cec5637965e0580007a40f77c802ebb7f5ca810456 type=CONTAINER_CREATED_EVENT
Jul 9 15:04:48.247558 containerd[1558]: time="2025-07-09T15:04:48.247037942Z" level=warning msg="container event discarded" container=9a40619251fcea779c4b1f650e40a59b01e19b1c1fd6cfceb7e88824d502a39d type=CONTAINER_CREATED_EVENT
Jul 9 15:04:48.383723 containerd[1558]: time="2025-07-09T15:04:48.383577875Z" level=warning msg="container event discarded" container=0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4 type=CONTAINER_CREATED_EVENT
Jul 9 15:04:48.383723 containerd[1558]: time="2025-07-09T15:04:48.383653727Z" level=warning msg="container event discarded" container=57c9b947afe276b5bdb772cec5637965e0580007a40f77c802ebb7f5ca810456 type=CONTAINER_STARTED_EVENT
Jul 9 15:04:48.383723 containerd[1558]: time="2025-07-09T15:04:48.383685737Z" level=warning msg="container event discarded" container=e984b94dc19a80c6b56aa5311447b07d619149298d0fd99f7c4acfeff71f8c2e type=CONTAINER_DELETED_EVENT
Jul 9 15:04:48.512537 sshd[7038]: Accepted publickey for core from 172.24.4.1 port 51572 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E
Jul 9 15:04:48.516388 sshd-session[7038]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 9 15:04:48.537913 systemd-logind[1533]: New session 28 of user core.
Jul 9 15:04:48.543775 systemd[1]: Started session-28.scope - Session 28 of User core.
Jul 9 15:04:48.560376 containerd[1558]: time="2025-07-09T15:04:48.560265751Z" level=warning msg="container event discarded" container=9a40619251fcea779c4b1f650e40a59b01e19b1c1fd6cfceb7e88824d502a39d type=CONTAINER_STARTED_EVENT
Jul 9 15:04:48.650885 containerd[1558]: time="2025-07-09T15:04:48.650720880Z" level=warning msg="container event discarded" container=0238e3e59df86882a92a30df035ac24a2ca2739e946949efec2b5fd45de083a4 type=CONTAINER_STARTED_EVENT
Jul 9 15:04:49.137517 sshd[7041]: Connection closed by 172.24.4.1 port 51572
Jul 9 15:04:49.136263 sshd-session[7038]: pam_unix(sshd:session): session closed for user core
Jul 9 15:04:49.150747 systemd[1]: sshd@25-172.24.4.222:22-172.24.4.1:51572.service: Deactivated successfully.
Jul 9 15:04:49.158713 systemd[1]: session-28.scope: Deactivated successfully.
Jul 9 15:04:49.161994 systemd-logind[1533]: Session 28 logged out. Waiting for processes to exit.
Jul 9 15:04:49.170142 systemd-logind[1533]: Removed session 28.
Jul 9 15:04:54.178134 systemd[1]: Started sshd@26-172.24.4.222:22-172.24.4.1:37704.service - OpenSSH per-connection server daemon (172.24.4.1:37704).
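
The recurring "container event discarded" warnings list container lifecycle events (CREATED, STARTED, STOPPED, DELETED) that containerd reports as dropped rather than delivered; in this journal they arrive in bursts as older containers are replaced and cleaned up. A small sketch, assuming only the message format shown above (the names are illustrative, not containerd's), that tallies such warnings by event type:

    import re
    from collections import Counter

    DISCARD_RE = re.compile(
        r'container event discarded" container=([0-9a-f]{64}) type=(\w+)')

    def discarded_event_counts(journal_text):
        """Tally containerd 'container event discarded' warnings by event type."""
        return Counter(event_type for _, event_type in DISCARD_RE.findall(journal_text))

    # Applied to this excerpt the counter is keyed by CONTAINER_CREATED_EVENT,
    # CONTAINER_STARTED_EVENT, CONTAINER_STOPPED_EVENT and CONTAINER_DELETED_EVENT.
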
Jul 9 15:04:55.297195 sshd[7054]: Accepted publickey for core from 172.24.4.1 port 37704 ssh2: RSA SHA256:RpjbNjJETt8jSicFeEb5c+P1rhb51pihPiw0RoN+r6E
Jul 9 15:04:55.300136 sshd-session[7054]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 9 15:04:55.311840 systemd-logind[1533]: New session 29 of user core.
Jul 9 15:04:55.322850 systemd[1]: Started session-29.scope - Session 29 of User core.
Jul 9 15:04:56.253872 sshd[7057]: Connection closed by 172.24.4.1 port 37704
Jul 9 15:04:56.255847 sshd-session[7054]: pam_unix(sshd:session): session closed for user core
Jul 9 15:04:56.268380 systemd[1]: sshd@26-172.24.4.222:22-172.24.4.1:37704.service: Deactivated successfully.
Jul 9 15:04:56.276702 systemd[1]: session-29.scope: Deactivated successfully.
Jul 9 15:04:56.280224 systemd-logind[1533]: Session 29 logged out. Waiting for processes to exit.
Jul 9 15:04:56.284899 systemd-logind[1533]: Removed session 29.
Jul 9 15:04:56.765428 containerd[1558]: time="2025-07-09T15:04:56.765326074Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a6e565f18035c5bbb74d172d870c88949037f8f2e84510d890388087f4f9ec50\" id:\"8f867f1fcdac336d0bc7a0618c4e04bb8702cf975e067b5a8ee90199adb72187\" pid:7083 exited_at:{seconds:1752073496 nanos:760938941}"
Jul 9 15:04:56.810837 containerd[1558]: time="2025-07-09T15:04:56.810782691Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8761ab9568ff0931eb78c90ebbe055d9537f06163865021a13e7092dd185e89d\" id:\"c48d7a0043be945e92deee243b4e177f53a308a0e6a40f46fb72afbe197cd1b1\" pid:7107 exited_at:{seconds:1752073496 nanos:810535958}"
Jul 9 15:04:57.934800 containerd[1558]: time="2025-07-09T15:04:57.934598173Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00b51b5f434d54dd5e16b06d15cc030016b471b0aafca79e7762cc0355e0cfcd\" id:\"ba6a1fe0209308a46f1f6bc0bcc0d8d77ae2c8746ca97db3eb229103e40200b3\" pid:7128 exited_at:{seconds:1752073497 nanos:933339831}"