Mar 20 19:15:49.093206 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Mar 20 13:16:44 -00 2025
Mar 20 19:15:49.093232 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=30d38910dcb9abcb2ae1fb8c4b62196472dfae1a70f494441b86ff0de2ee88c9
Mar 20 19:15:49.093242 kernel: BIOS-provided physical RAM map:
Mar 20 19:15:49.093250 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 20 19:15:49.093257 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 20 19:15:49.093267 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 20 19:15:49.093276 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable
Mar 20 19:15:49.093283 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved
Mar 20 19:15:49.093291 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 20 19:15:49.093299 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 20 19:15:49.093306 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable
Mar 20 19:15:49.093314 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 20 19:15:49.093321 kernel: NX (Execute Disable) protection: active
Mar 20 19:15:49.093329 kernel: APIC: Static calls initialized
Mar 20 19:15:49.093340 kernel: SMBIOS 3.0.0 present.
Mar 20 19:15:49.094517 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014
Mar 20 19:15:49.094527 kernel: Hypervisor detected: KVM
Mar 20 19:15:49.094535 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 20 19:15:49.094544 kernel: kvm-clock: using sched offset of 3479872696 cycles
Mar 20 19:15:49.094552 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 20 19:15:49.094564 kernel: tsc: Detected 1996.249 MHz processor
Mar 20 19:15:49.094573 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 20 19:15:49.094582 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 20 19:15:49.094591 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000
Mar 20 19:15:49.094599 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 20 19:15:49.094608 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 20 19:15:49.094616 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000
Mar 20 19:15:49.094624 kernel: ACPI: Early table checksum verification disabled
Mar 20 19:15:49.094635 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS )
Mar 20 19:15:49.094643 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 19:15:49.094652 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 19:15:49.094660 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 19:15:49.094668 kernel: ACPI: FACS 0x00000000BFFE0000 000040
Mar 20 19:15:49.094676 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 19:15:49.094685 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 20 19:15:49.094693 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc]
Mar 20 19:15:49.094702 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48]
Mar 20 19:15:49.094712 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f]
Mar 20 19:15:49.094720 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c]
Mar 20 19:15:49.094729 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64]
Mar 20 19:15:49.094740 kernel: No NUMA configuration found
Mar 20 19:15:49.094749 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff]
Mar 20 19:15:49.094758 kernel: NODE_DATA(0) allocated [mem 0x13fff7000-0x13fffcfff]
Mar 20 19:15:49.094767 kernel: Zone ranges:
Mar 20 19:15:49.094777 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 20 19:15:49.094786 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 20 19:15:49.094795 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff]
Mar 20 19:15:49.094803 kernel: Movable zone start for each node
Mar 20 19:15:49.094812 kernel: Early memory node ranges
Mar 20 19:15:49.094820 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 20 19:15:49.094829 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff]
Mar 20 19:15:49.094838 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff]
Mar 20 19:15:49.094848 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff]
Mar 20 19:15:49.094857 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 20 19:15:49.094865 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 20 19:15:49.094874 kernel: On node 0, zone Normal: 35 pages in unavailable ranges
Mar 20 19:15:49.094883 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 20 19:15:49.094892 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 20 19:15:49.094901 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 20 19:15:49.094909 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 20 19:15:49.094918 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 20 19:15:49.094929 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 20 19:15:49.094937 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 20 19:15:49.094946 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 20 19:15:49.094954 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 20 19:15:49.094963 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Mar 20 19:15:49.094972 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 20 19:15:49.094980 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Mar 20 19:15:49.094989 kernel: Booting paravirtualized kernel on KVM
Mar 20 19:15:49.094998 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 20 19:15:49.095008 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 20 19:15:49.095017 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576
Mar 20 19:15:49.095026 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152
Mar 20 19:15:49.095034 kernel: pcpu-alloc: [0] 0 1
Mar 20 19:15:49.095043 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 20 19:15:49.095053 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=30d38910dcb9abcb2ae1fb8c4b62196472dfae1a70f494441b86ff0de2ee88c9
Mar 20 19:15:49.095062 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 20 19:15:49.095070 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 20 19:15:49.095081 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 20 19:15:49.095090 kernel: Fallback order for Node 0: 0
Mar 20 19:15:49.095098 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1031901
Mar 20 19:15:49.095107 kernel: Policy zone: Normal
Mar 20 19:15:49.095116 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 20 19:15:49.095124 kernel: software IO TLB: area num 2.
Mar 20 19:15:49.095133 kernel: Memory: 3962108K/4193772K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43592K init, 1472K bss, 231404K reserved, 0K cma-reserved)
Mar 20 19:15:49.095142 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 20 19:15:49.095150 kernel: ftrace: allocating 37985 entries in 149 pages
Mar 20 19:15:49.095161 kernel: ftrace: allocated 149 pages with 4 groups
Mar 20 19:15:49.095169 kernel: Dynamic Preempt: voluntary
Mar 20 19:15:49.095178 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 20 19:15:49.095187 kernel: rcu: RCU event tracing is enabled.
Mar 20 19:15:49.095196 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 20 19:15:49.095205 kernel: Trampoline variant of Tasks RCU enabled.
Mar 20 19:15:49.095214 kernel: Rude variant of Tasks RCU enabled.
Mar 20 19:15:49.095223 kernel: Tracing variant of Tasks RCU enabled.
Mar 20 19:15:49.095232 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 20 19:15:49.095242 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 20 19:15:49.095251 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 20 19:15:49.095259 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 20 19:15:49.095268 kernel: Console: colour VGA+ 80x25
Mar 20 19:15:49.095277 kernel: printk: console [tty0] enabled
Mar 20 19:15:49.095285 kernel: printk: console [ttyS0] enabled
Mar 20 19:15:49.095294 kernel: ACPI: Core revision 20230628
Mar 20 19:15:49.095303 kernel: APIC: Switch to symmetric I/O mode setup
Mar 20 19:15:49.095311 kernel: x2apic enabled
Mar 20 19:15:49.095322 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 20 19:15:49.095330 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 20 19:15:49.095339 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Mar 20 19:15:49.095360 kernel: Calibrating delay loop (skipped) preset value.. 3992.49 BogoMIPS (lpj=1996249)
Mar 20 19:15:49.095370 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Mar 20 19:15:49.095378 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Mar 20 19:15:49.095387 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 20 19:15:49.095396 kernel: Spectre V2 : Mitigation: Retpolines
Mar 20 19:15:49.095404 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 20 19:15:49.095416 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Mar 20 19:15:49.095425 kernel: Speculative Store Bypass: Vulnerable
Mar 20 19:15:49.095433 kernel: x86/fpu: x87 FPU will use FXSAVE
Mar 20 19:15:49.095442 kernel: Freeing SMP alternatives memory: 32K
Mar 20 19:15:49.095458 kernel: pid_max: default: 32768 minimum: 301
Mar 20 19:15:49.095469 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 20 19:15:49.095478 kernel: landlock: Up and running.
Mar 20 19:15:49.095487 kernel: SELinux: Initializing.
Mar 20 19:15:49.095496 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 20 19:15:49.095505 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 20 19:15:49.095514 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3)
Mar 20 19:15:49.095524 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 20 19:15:49.095535 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 20 19:15:49.095545 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 20 19:15:49.095554 kernel: Performance Events: AMD PMU driver.
Mar 20 19:15:49.095563 kernel: ... version: 0
Mar 20 19:15:49.095572 kernel: ... bit width: 48
Mar 20 19:15:49.095583 kernel: ... generic registers: 4
Mar 20 19:15:49.095592 kernel: ... value mask: 0000ffffffffffff
Mar 20 19:15:49.095601 kernel: ... max period: 00007fffffffffff
Mar 20 19:15:49.095610 kernel: ... fixed-purpose events: 0
Mar 20 19:15:49.095619 kernel: ... event mask: 000000000000000f
Mar 20 19:15:49.095628 kernel: signal: max sigframe size: 1440
Mar 20 19:15:49.095637 kernel: rcu: Hierarchical SRCU implementation.
Mar 20 19:15:49.095647 kernel: rcu: Max phase no-delay instances is 400.
Mar 20 19:15:49.095656 kernel: smp: Bringing up secondary CPUs ...
Mar 20 19:15:49.095667 kernel: smpboot: x86: Booting SMP configuration:
Mar 20 19:15:49.095676 kernel: .... node #0, CPUs: #1
Mar 20 19:15:49.095685 kernel: smp: Brought up 1 node, 2 CPUs
Mar 20 19:15:49.095694 kernel: smpboot: Max logical packages: 2
Mar 20 19:15:49.095703 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS)
Mar 20 19:15:49.095712 kernel: devtmpfs: initialized
Mar 20 19:15:49.095721 kernel: x86/mm: Memory block size: 128MB
Mar 20 19:15:49.095730 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 20 19:15:49.095740 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 20 19:15:49.095751 kernel: pinctrl core: initialized pinctrl subsystem
Mar 20 19:15:49.095760 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 20 19:15:49.095769 kernel: audit: initializing netlink subsys (disabled)
Mar 20 19:15:49.095778 kernel: audit: type=2000 audit(1742498147.744:1): state=initialized audit_enabled=0 res=1
Mar 20 19:15:49.095787 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 20 19:15:49.095796 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 20 19:15:49.095805 kernel: cpuidle: using governor menu
Mar 20 19:15:49.095814 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 20 19:15:49.095823 kernel: dca service started, version 1.12.1
Mar 20 19:15:49.095835 kernel: PCI: Using configuration type 1 for base access
Mar 20 19:15:49.095844 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 20 19:15:49.095853 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 20 19:15:49.095862 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 20 19:15:49.095871 kernel: ACPI: Added _OSI(Module Device)
Mar 20 19:15:49.095880 kernel: ACPI: Added _OSI(Processor Device)
Mar 20 19:15:49.095889 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 20 19:15:49.095898 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 20 19:15:49.095908 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 20 19:15:49.095919 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 20 19:15:49.095928 kernel: ACPI: Interpreter enabled
Mar 20 19:15:49.095937 kernel: ACPI: PM: (supports S0 S3 S5)
Mar 20 19:15:49.095946 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 20 19:15:49.095955 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 20 19:15:49.095964 kernel: PCI: Using E820 reservations for host bridge windows
Mar 20 19:15:49.095973 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Mar 20 19:15:49.095982 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 20 19:15:49.096118 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Mar 20 19:15:49.096220 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Mar 20 19:15:49.096311 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Mar 20 19:15:49.096326 kernel: acpiphp: Slot [3] registered
Mar 20 19:15:49.096335 kernel: acpiphp: Slot [4] registered
Mar 20 19:15:49.096344 kernel: acpiphp: Slot [5] registered
Mar 20 19:15:49.099253 kernel: acpiphp: Slot [6] registered
Mar 20 19:15:49.099264 kernel: acpiphp: Slot [7] registered
Mar 20 19:15:49.099273 kernel: acpiphp: Slot [8] registered
Mar 20 19:15:49.099287 kernel: acpiphp: Slot [9] registered
Mar 20 19:15:49.099296 kernel: acpiphp: Slot [10] registered
Mar 20 19:15:49.099305 kernel: acpiphp: Slot [11] registered
Mar 20 19:15:49.099315 kernel: acpiphp: Slot [12] registered
Mar 20 19:15:49.099324 kernel: acpiphp: Slot [13] registered
Mar 20 19:15:49.099333 kernel: acpiphp: Slot [14] registered
Mar 20 19:15:49.099342 kernel: acpiphp: Slot [15] registered
Mar 20 19:15:49.099371 kernel: acpiphp: Slot [16] registered
Mar 20 19:15:49.099380 kernel: acpiphp: Slot [17] registered
Mar 20 19:15:49.099392 kernel: acpiphp: Slot [18] registered
Mar 20 19:15:49.099402 kernel: acpiphp: Slot [19] registered
Mar 20 19:15:49.099410 kernel: acpiphp: Slot [20] registered
Mar 20 19:15:49.099419 kernel: acpiphp: Slot [21] registered
Mar 20 19:15:49.099429 kernel: acpiphp: Slot [22] registered
Mar 20 19:15:49.099438 kernel: acpiphp: Slot [23] registered
Mar 20 19:15:49.099447 kernel: acpiphp: Slot [24] registered
Mar 20 19:15:49.099455 kernel: acpiphp: Slot [25] registered
Mar 20 19:15:49.099464 kernel: acpiphp: Slot [26] registered
Mar 20 19:15:49.099473 kernel: acpiphp: Slot [27] registered
Mar 20 19:15:49.099485 kernel: acpiphp: Slot [28] registered
Mar 20 19:15:49.099494 kernel: acpiphp: Slot [29] registered
Mar 20 19:15:49.099503 kernel: acpiphp: Slot [30] registered
Mar 20 19:15:49.099512 kernel: acpiphp: Slot [31] registered
Mar 20 19:15:49.099521 kernel: PCI host bridge to bus 0000:00
Mar 20 19:15:49.099626 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 20 19:15:49.099712 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 20 19:15:49.099795 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 20 19:15:49.099883 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 20 19:15:49.099966 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window]
Mar 20 19:15:49.100047 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 20 19:15:49.100164 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Mar 20 19:15:49.100267 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Mar 20 19:15:49.100386 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Mar 20 19:15:49.100490 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f]
Mar 20 19:15:49.100583 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Mar 20 19:15:49.100677 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Mar 20 19:15:49.100768 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Mar 20 19:15:49.100861 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Mar 20 19:15:49.100960 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Mar 20 19:15:49.101053 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Mar 20 19:15:49.101159 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Mar 20 19:15:49.101266 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Mar 20 19:15:49.102454 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Mar 20 19:15:49.102567 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref]
Mar 20 19:15:49.102669 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Mar 20 19:15:49.102771 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Mar 20 19:15:49.102879 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 20 19:15:49.102989 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Mar 20 19:15:49.103089 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Mar 20 19:15:49.103190 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Mar 20 19:15:49.103282 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref]
Mar 20 19:15:49.103398 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Mar 20 19:15:49.103500 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Mar 20 19:15:49.103601 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Mar 20 19:15:49.103695 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Mar 20 19:15:49.103788 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref]
Mar 20 19:15:49.103888 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Mar 20 19:15:49.103984 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Mar 20 19:15:49.104077 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref]
Mar 20 19:15:49.104181 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Mar 20 19:15:49.104281 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f]
Mar 20 19:15:49.106446 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff]
Mar 20 19:15:49.106586 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref]
Mar 20 19:15:49.106603 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 20 19:15:49.106615 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 20 19:15:49.106625 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 20 19:15:49.106636 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 20 19:15:49.106647 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Mar 20 19:15:49.106663 kernel: iommu: Default domain type: Translated
Mar 20 19:15:49.106674 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 20 19:15:49.106684 kernel: PCI: Using ACPI for IRQ routing
Mar 20 19:15:49.106694 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 20 19:15:49.106704 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 20 19:15:49.106715 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff]
Mar 20 19:15:49.106821 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Mar 20 19:15:49.106925 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Mar 20 19:15:49.107027 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 20 19:15:49.107046 kernel: vgaarb: loaded
Mar 20 19:15:49.107056 kernel: clocksource: Switched to clocksource kvm-clock
Mar 20 19:15:49.107066 kernel: VFS: Disk quotas dquot_6.6.0
Mar 20 19:15:49.107077 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 20 19:15:49.107087 kernel: pnp: PnP ACPI init
Mar 20 19:15:49.107196 kernel: pnp 00:03: [dma 2]
Mar 20 19:15:49.107212 kernel: pnp: PnP ACPI: found 5 devices
Mar 20 19:15:49.107222 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 20 19:15:49.107235 kernel: NET: Registered PF_INET protocol family
Mar 20 19:15:49.107244 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 20 19:15:49.107254 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 20 19:15:49.107263 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 20 19:15:49.107272 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 20 19:15:49.107282 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 20 19:15:49.107291 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 20 19:15:49.107300 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 20 19:15:49.107310 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 20 19:15:49.107320 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 20 19:15:49.107330 kernel: NET: Registered PF_XDP protocol family
Mar 20 19:15:49.109447 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 20 19:15:49.109535 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 20 19:15:49.109618 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 20 19:15:49.109701 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Mar 20 19:15:49.109783 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window]
Mar 20 19:15:49.109883 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Mar 20 19:15:49.109987 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Mar 20 19:15:49.110019 kernel: PCI: CLS 0 bytes, default 64
Mar 20 19:15:49.110029 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Mar 20 19:15:49.110038 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB)
Mar 20 19:15:49.110048 kernel: Initialise system trusted keyrings
Mar 20 19:15:49.110057 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 20 19:15:49.110066 kernel: Key type asymmetric registered
Mar 20 19:15:49.110075 kernel: Asymmetric key parser 'x509' registered
Mar 20 19:15:49.110085 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 20 19:15:49.110098 kernel: io scheduler mq-deadline registered
Mar 20 19:15:49.110108 kernel: io scheduler kyber registered
Mar 20 19:15:49.110117 kernel: io scheduler bfq registered
Mar 20 19:15:49.110126 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 20 19:15:49.110136 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Mar 20 19:15:49.110146 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Mar 20 19:15:49.110156 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Mar 20 19:15:49.110165 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Mar 20 19:15:49.110174 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 20 19:15:49.110186 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 20 19:15:49.110195 kernel: random: crng init done
Mar 20 19:15:49.110204 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Mar 20 19:15:49.110213 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Mar 20 19:15:49.110223 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Mar 20 19:15:49.110318 kernel: rtc_cmos 00:04: RTC can wake from S4
Mar 20 19:15:49.110333 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Mar 20 19:15:49.110445 kernel: rtc_cmos 00:04: registered as rtc0
Mar 20 19:15:49.110539 kernel: rtc_cmos 00:04: setting system clock to 2025-03-20T19:15:48 UTC (1742498148)
Mar 20 19:15:49.110624 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Mar 20 19:15:49.110639 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Mar 20 19:15:49.110648 kernel: NET: Registered PF_INET6 protocol family
Mar 20 19:15:49.110658 kernel: Segment Routing with IPv6
Mar 20 19:15:49.110667 kernel: In-situ OAM (IOAM) with IPv6
Mar 20 19:15:49.110676 kernel: NET: Registered PF_PACKET protocol family
Mar 20 19:15:49.110685 kernel: Key type dns_resolver registered
Mar 20 19:15:49.110694 kernel: IPI shorthand broadcast: enabled
Mar 20 19:15:49.110707 kernel: sched_clock: Marking stable (1002007953, 170924867)->(1233892115, -60959295)
Mar 20 19:15:49.110716 kernel: registered taskstats version 1
Mar 20 19:15:49.110725 kernel: Loading compiled-in X.509 certificates
Mar 20 19:15:49.110735 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: 2c0605e0441a1fddfb1f70673dce1f0d470be9b5'
Mar 20 19:15:49.110744 kernel: Key type .fscrypt registered
Mar 20 19:15:49.110753 kernel: Key type fscrypt-provisioning registered
Mar 20 19:15:49.110762 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 20 19:15:49.110772 kernel: ima: Allocated hash algorithm: sha1
Mar 20 19:15:49.110782 kernel: ima: No architecture policies found
Mar 20 19:15:49.110792 kernel: clk: Disabling unused clocks
Mar 20 19:15:49.110801 kernel: Freeing unused kernel image (initmem) memory: 43592K
Mar 20 19:15:49.110810 kernel: Write protecting the kernel read-only data: 40960k
Mar 20 19:15:49.110819 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K
Mar 20 19:15:49.110828 kernel: Run /init as init process
Mar 20 19:15:49.110838 kernel: with arguments:
Mar 20 19:15:49.110847 kernel: /init
Mar 20 19:15:49.110856 kernel: with environment:
Mar 20 19:15:49.110865 kernel: HOME=/
Mar 20 19:15:49.110876 kernel: TERM=linux
Mar 20 19:15:49.110885 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 20 19:15:49.110896 systemd[1]: Successfully made /usr/ read-only.
Mar 20 19:15:49.110909 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 20 19:15:49.110920 systemd[1]: Detected virtualization kvm.
Mar 20 19:15:49.110930 systemd[1]: Detected architecture x86-64.
Mar 20 19:15:49.110940 systemd[1]: Running in initrd.
Mar 20 19:15:49.110951 systemd[1]: No hostname configured, using default hostname.
Mar 20 19:15:49.110962 systemd[1]: Hostname set to .
Mar 20 19:15:49.110971 systemd[1]: Initializing machine ID from VM UUID.
Mar 20 19:15:49.110981 systemd[1]: Queued start job for default target initrd.target.
Mar 20 19:15:49.110991 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 20 19:15:49.111002 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 20 19:15:49.111020 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 20 19:15:49.111032 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 20 19:15:49.111044 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 20 19:15:49.111055 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 20 19:15:49.111066 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 20 19:15:49.111077 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 20 19:15:49.111088 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 20 19:15:49.111099 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 20 19:15:49.111109 systemd[1]: Reached target paths.target - Path Units.
Mar 20 19:15:49.111119 systemd[1]: Reached target slices.target - Slice Units.
Mar 20 19:15:49.111129 systemd[1]: Reached target swap.target - Swaps.
Mar 20 19:15:49.111139 systemd[1]: Reached target timers.target - Timer Units.
Mar 20 19:15:49.111149 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 20 19:15:49.111159 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 20 19:15:49.111169 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 20 19:15:49.111181 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 20 19:15:49.111192 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 20 19:15:49.111202 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 20 19:15:49.111212 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 20 19:15:49.111222 systemd[1]: Reached target sockets.target - Socket Units.
Mar 20 19:15:49.111232 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 20 19:15:49.111242 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 20 19:15:49.111252 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 20 19:15:49.111264 systemd[1]: Starting systemd-fsck-usr.service...
Mar 20 19:15:49.111274 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 20 19:15:49.111284 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 20 19:15:49.111295 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 19:15:49.111305 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 20 19:15:49.111315 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 20 19:15:49.111327 systemd[1]: Finished systemd-fsck-usr.service.
Mar 20 19:15:49.111338 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 20 19:15:49.111391 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 20 19:15:49.111426 systemd-journald[184]: Collecting audit messages is disabled.
Mar 20 19:15:49.111457 systemd-journald[184]: Journal started
Mar 20 19:15:49.111480 systemd-journald[184]: Runtime Journal (/run/log/journal/390ab905d6c740e9a6f8ddced2aeaaed) is 8M, max 78.2M, 70.2M free.
Mar 20 19:15:49.101389 systemd-modules-load[186]: Inserted module 'overlay'
Mar 20 19:15:49.124480 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 20 19:15:49.125665 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 19:15:49.130463 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 20 19:15:49.134630 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 20 19:15:49.136449 kernel: Bridge firewalling registered
Mar 20 19:15:49.136426 systemd-modules-load[186]: Inserted module 'br_netfilter'
Mar 20 19:15:49.138901 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 20 19:15:49.142473 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 20 19:15:49.150895 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 20 19:15:49.159506 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 20 19:15:49.161206 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 19:15:49.167484 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 20 19:15:49.169014 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 20 19:15:49.172528 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 20 19:15:49.173223 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 20 19:15:49.191455 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 20 19:15:49.199870 dracut-cmdline[219]: dracut-dracut-053
Mar 20 19:15:49.204840 dracut-cmdline[219]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=30d38910dcb9abcb2ae1fb8c4b62196472dfae1a70f494441b86ff0de2ee88c9
Mar 20 19:15:49.240232 systemd-resolved[221]: Positive Trust Anchors:
Mar 20 19:15:49.240244 systemd-resolved[221]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 20 19:15:49.240286 systemd-resolved[221]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 20 19:15:49.247205 systemd-resolved[221]: Defaulting to hostname 'linux'.
Mar 20 19:15:49.248143 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 20 19:15:49.248699 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 20 19:15:49.280420 kernel: SCSI subsystem initialized
Mar 20 19:15:49.290453 kernel: Loading iSCSI transport class v2.0-870.
Mar 20 19:15:49.302407 kernel: iscsi: registered transport (tcp)
Mar 20 19:15:49.324666 kernel: iscsi: registered transport (qla4xxx)
Mar 20 19:15:49.324729 kernel: QLogic iSCSI HBA Driver
Mar 20 19:15:49.379343 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 20 19:15:49.382024 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 20 19:15:49.442154 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 20 19:15:49.442254 kernel: device-mapper: uevent: version 1.0.3
Mar 20 19:15:49.444721 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 20 19:15:49.503467 kernel: raid6: sse2x4 gen() 6594 MB/s
Mar 20 19:15:49.521441 kernel: raid6: sse2x2 gen() 8018 MB/s
Mar 20 19:15:49.539811 kernel: raid6: sse2x1 gen() 9730 MB/s
Mar 20 19:15:49.539871 kernel: raid6: using algorithm sse2x1 gen() 9730 MB/s
Mar 20 19:15:49.558773 kernel: raid6: .... xor() 7341 MB/s, rmw enabled
Mar 20 19:15:49.558833 kernel: raid6: using ssse3x2 recovery algorithm
Mar 20 19:15:49.580431 kernel: xor: measuring software checksum speed
Mar 20 19:15:49.582896 kernel: prefetch64-sse : 17209 MB/sec
Mar 20 19:15:49.582956 kernel: generic_sse : 15746 MB/sec
Mar 20 19:15:49.582985 kernel: xor: using function: prefetch64-sse (17209 MB/sec)
Mar 20 19:15:49.756417 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 20 19:15:49.771538 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 20 19:15:49.777102 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 20 19:15:49.804121 systemd-udevd[404]: Using default interface naming scheme 'v255'.
Mar 20 19:15:49.809749 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 20 19:15:49.814320 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 20 19:15:49.849668 dracut-pre-trigger[415]: rd.md=0: removing MD RAID activation
Mar 20 19:15:49.891666 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 20 19:15:49.894476 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 20 19:15:49.948795 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 20 19:15:49.957577 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 20 19:15:50.001548 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 20 19:15:50.007032 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 20 19:15:50.009291 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 20 19:15:50.011235 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 20 19:15:50.015469 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 20 19:15:50.034371 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues
Mar 20 19:15:50.078772 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB)
Mar 20 19:15:50.078904 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 20 19:15:50.078927 kernel: GPT:17805311 != 20971519
Mar 20 19:15:50.078940 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 20 19:15:50.078954 kernel: GPT:17805311 != 20971519
Mar 20 19:15:50.078967 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 20 19:15:50.078980 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 20 19:15:50.040953 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 20 19:15:50.082550 kernel: libata version 3.00 loaded.
Mar 20 19:15:50.083761 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 20 19:15:50.088039 kernel: ata_piix 0000:00:01.1: version 2.13
Mar 20 19:15:50.093531 kernel: scsi host0: ata_piix
Mar 20 19:15:50.093652 kernel: scsi host1: ata_piix
Mar 20 19:15:50.093771 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14
Mar 20 19:15:50.093785 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15
Mar 20 19:15:50.084865 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 19:15:50.092484 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 20 19:15:50.092979 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 20 19:15:50.093025 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 19:15:50.094693 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 19:15:50.096677 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 19:15:50.097399 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 20 19:15:50.125372 kernel: BTRFS: device fsid 5af3bf9c-0d36-4793-88d6-028c3ca48c10 devid 1 transid 38 /dev/vda3 scanned by (udev-worker) (460)
Mar 20 19:15:50.131388 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (456)
Mar 20 19:15:50.161621 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 20 19:15:50.173050 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 19:15:50.184943 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 20 19:15:50.193648 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 20 19:15:50.194212 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 20 19:15:50.205858 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 20 19:15:50.209460 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 20 19:15:50.210767 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 20 19:15:50.235901 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 19:15:50.239582 disk-uuid[506]: Primary Header is updated.
Mar 20 19:15:50.239582 disk-uuid[506]: Secondary Entries is updated.
Mar 20 19:15:50.239582 disk-uuid[506]: Secondary Header is updated.
Mar 20 19:15:50.249376 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 20 19:15:51.270420 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 20 19:15:51.271176 disk-uuid[513]: The operation has completed successfully.
Mar 20 19:15:51.356935 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 20 19:15:51.357035 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 20 19:15:51.405189 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 20 19:15:51.425223 sh[525]: Success
Mar 20 19:15:51.439646 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3"
Mar 20 19:15:51.509414 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 20 19:15:51.514453 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 20 19:15:51.519487 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 20 19:15:51.543419 kernel: BTRFS info (device dm-0): first mount of filesystem 5af3bf9c-0d36-4793-88d6-028c3ca48c10
Mar 20 19:15:51.543503 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 20 19:15:51.543528 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 20 19:15:51.545475 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 20 19:15:51.547945 kernel: BTRFS info (device dm-0): using free space tree
Mar 20 19:15:51.565045 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 20 19:15:51.567236 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 20 19:15:51.570441 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 20 19:15:51.577571 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 20 19:15:51.620276 kernel: BTRFS info (device vda6): first mount of filesystem d877ba4c-bfdd-4ad4-94ef-51dbb6b505e4
Mar 20 19:15:51.620403 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 20 19:15:51.624912 kernel: BTRFS info (device vda6): using free space tree
Mar 20 19:15:51.638444 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 20 19:15:51.650617 kernel: BTRFS info (device vda6): last unmount of filesystem d877ba4c-bfdd-4ad4-94ef-51dbb6b505e4
Mar 20 19:15:51.659063 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 20 19:15:51.664342 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 20 19:15:51.733146 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 20 19:15:51.739461 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 20 19:15:51.781586 systemd-networkd[706]: lo: Link UP
Mar 20 19:15:51.781594 systemd-networkd[706]: lo: Gained carrier
Mar 20 19:15:51.783941 systemd-networkd[706]: Enumeration completed
Mar 20 19:15:51.784302 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 20 19:15:51.784752 systemd-networkd[706]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 19:15:51.784756 systemd-networkd[706]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 20 19:15:51.785895 systemd[1]: Reached target network.target - Network.
Mar 20 19:15:51.787997 systemd-networkd[706]: eth0: Link UP
Mar 20 19:15:51.788001 systemd-networkd[706]: eth0: Gained carrier
Mar 20 19:15:51.788008 systemd-networkd[706]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 19:15:51.801625 systemd-networkd[706]: eth0: DHCPv4 address 172.24.4.12/24, gateway 172.24.4.1 acquired from 172.24.4.1
Mar 20 19:15:51.812675 ignition[627]: Ignition 2.20.0
Mar 20 19:15:51.812687 ignition[627]: Stage: fetch-offline
Mar 20 19:15:51.814834 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 20 19:15:51.812716 ignition[627]: no configs at "/usr/lib/ignition/base.d"
Mar 20 19:15:51.812726 ignition[627]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 19:15:51.812815 ignition[627]: parsed url from cmdline: ""
Mar 20 19:15:51.812819 ignition[627]: no config URL provided
Mar 20 19:15:51.812825 ignition[627]: reading system config file "/usr/lib/ignition/user.ign"
Mar 20 19:15:51.812832 ignition[627]: no config at "/usr/lib/ignition/user.ign"
Mar 20 19:15:51.818491 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 20 19:15:51.812837 ignition[627]: failed to fetch config: resource requires networking
Mar 20 19:15:51.813011 ignition[627]: Ignition finished successfully
Mar 20 19:15:51.840129 ignition[717]: Ignition 2.20.0
Mar 20 19:15:51.840845 ignition[717]: Stage: fetch
Mar 20 19:15:51.841015 ignition[717]: no configs at "/usr/lib/ignition/base.d"
Mar 20 19:15:51.841027 ignition[717]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 19:15:51.841117 ignition[717]: parsed url from cmdline: ""
Mar 20 19:15:51.841121 ignition[717]: no config URL provided
Mar 20 19:15:51.841127 ignition[717]: reading system config file "/usr/lib/ignition/user.ign"
Mar 20 19:15:51.841135 ignition[717]: no config at "/usr/lib/ignition/user.ign"
Mar 20 19:15:51.841221 ignition[717]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Mar 20 19:15:51.841250 ignition[717]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Mar 20 19:15:51.841256 ignition[717]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Mar 20 19:15:52.257860 ignition[717]: GET result: OK
Mar 20 19:15:52.258407 ignition[717]: parsing config with SHA512: c283e834cda8cfec16dc67f3c18a9109d4ee70aab40e3b0e84eafe5cd92a268dc3971a4aebdf503a00ee81fd2548b628344a09b79f7003d667b8052fd15a74f0
Mar 20 19:15:52.272048 unknown[717]: fetched base config from "system"
Mar 20 19:15:52.274331 unknown[717]: fetched base config from "system"
Mar 20 19:15:52.274388 unknown[717]: fetched user config from "openstack"
Mar 20 19:15:52.275650 ignition[717]: fetch: fetch complete
Mar 20 19:15:52.275663 ignition[717]: fetch: fetch passed
Mar 20 19:15:52.279021 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 20 19:15:52.275787 ignition[717]: Ignition finished successfully
Mar 20 19:15:52.283606 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 20 19:15:52.331117 ignition[724]: Ignition 2.20.0
Mar 20 19:15:52.331146 ignition[724]: Stage: kargs
Mar 20 19:15:52.331604 ignition[724]: no configs at "/usr/lib/ignition/base.d"
Mar 20 19:15:52.331630 ignition[724]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 19:15:52.336527 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 20 19:15:52.333936 ignition[724]: kargs: kargs passed
Mar 20 19:15:52.334061 ignition[724]: Ignition finished successfully
Mar 20 19:15:52.343301 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 20 19:15:52.379091 ignition[731]: Ignition 2.20.0
Mar 20 19:15:52.379117 ignition[731]: Stage: disks
Mar 20 19:15:52.379563 ignition[731]: no configs at "/usr/lib/ignition/base.d"
Mar 20 19:15:52.379590 ignition[731]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 19:15:52.387814 ignition[731]: disks: disks passed
Mar 20 19:15:52.387908 ignition[731]: Ignition finished successfully
Mar 20 19:15:52.389862 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 20 19:15:52.392714 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 20 19:15:52.394642 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 20 19:15:52.397571 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 20 19:15:52.400474 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 20 19:15:52.402927 systemd[1]: Reached target basic.target - Basic System.
Mar 20 19:15:52.407570 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 20 19:15:52.452952 systemd-fsck[739]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Mar 20 19:15:52.464194 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 20 19:15:52.468609 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 20 19:15:52.635934 kernel: EXT4-fs (vda9): mounted filesystem bf9c440e-9fee-4e54-8539-b83f5a9eea2f r/w with ordered data mode. Quota mode: none.
Mar 20 19:15:52.636263 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 20 19:15:52.637216 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 20 19:15:52.640425 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 20 19:15:52.643987 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 20 19:15:52.644695 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 20 19:15:52.646490 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Mar 20 19:15:52.647748 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 20 19:15:52.647776 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 20 19:15:52.662184 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 20 19:15:52.671757 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 20 19:15:52.686410 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (747)
Mar 20 19:15:52.709907 kernel: BTRFS info (device vda6): first mount of filesystem d877ba4c-bfdd-4ad4-94ef-51dbb6b505e4
Mar 20 19:15:52.709961 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 20 19:15:52.710004 kernel: BTRFS info (device vda6): using free space tree
Mar 20 19:15:52.721371 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 20 19:15:52.728260 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 20 19:15:52.777605 initrd-setup-root[775]: cut: /sysroot/etc/passwd: No such file or directory
Mar 20 19:15:52.781727 initrd-setup-root[782]: cut: /sysroot/etc/group: No such file or directory
Mar 20 19:15:52.787487 initrd-setup-root[789]: cut: /sysroot/etc/shadow: No such file or directory
Mar 20 19:15:52.791828 initrd-setup-root[796]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 20 19:15:52.875512 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 20 19:15:52.877203 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 20 19:15:52.879515 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 20 19:15:52.889800 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 20 19:15:52.893049 kernel: BTRFS info (device vda6): last unmount of filesystem d877ba4c-bfdd-4ad4-94ef-51dbb6b505e4
Mar 20 19:15:52.914115 ignition[863]: INFO : Ignition 2.20.0
Mar 20 19:15:52.915952 ignition[863]: INFO : Stage: mount
Mar 20 19:15:52.915952 ignition[863]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 20 19:15:52.915952 ignition[863]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 19:15:52.915269 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 20 19:15:52.919485 ignition[863]: INFO : mount: mount passed
Mar 20 19:15:52.919485 ignition[863]: INFO : Ignition finished successfully
Mar 20 19:15:52.917411 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 20 19:15:53.565673 systemd-networkd[706]: eth0: Gained IPv6LL
Mar 20 19:15:59.823976 coreos-metadata[749]: Mar 20 19:15:59.823 WARN failed to locate config-drive, using the metadata service API instead
Mar 20 19:15:59.865922 coreos-metadata[749]: Mar 20 19:15:59.865 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 20 19:15:59.880724 coreos-metadata[749]: Mar 20 19:15:59.880 INFO Fetch successful
Mar 20 19:15:59.883579 coreos-metadata[749]: Mar 20 19:15:59.883 INFO wrote hostname ci-9999-0-1-1-f6fba67404.novalocal to /sysroot/etc/hostname
Mar 20 19:15:59.885839 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Mar 20 19:15:59.886125 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Mar 20 19:15:59.893617 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 20 19:15:59.919594 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 20 19:15:59.952428 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (881)
Mar 20 19:15:59.961608 kernel: BTRFS info (device vda6): first mount of filesystem d877ba4c-bfdd-4ad4-94ef-51dbb6b505e4
Mar 20 19:15:59.961672 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 20 19:15:59.965800 kernel: BTRFS info (device vda6): using free space tree
Mar 20 19:15:59.977460 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 20 19:15:59.982727 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 20 19:16:00.027396 ignition[899]: INFO : Ignition 2.20.0
Mar 20 19:16:00.027396 ignition[899]: INFO : Stage: files
Mar 20 19:16:00.027396 ignition[899]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 20 19:16:00.027396 ignition[899]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 19:16:00.034552 ignition[899]: DEBUG : files: compiled without relabeling support, skipping
Mar 20 19:16:00.034552 ignition[899]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 20 19:16:00.034552 ignition[899]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 20 19:16:00.039968 ignition[899]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 20 19:16:00.039968 ignition[899]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 20 19:16:00.043679 ignition[899]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 20 19:16:00.040345 unknown[899]: wrote ssh authorized keys file for user: core
Mar 20 19:16:00.047293 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 20 19:16:00.047293 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Mar 20 19:16:00.307184 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 20 19:16:01.312441 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 20 19:16:01.312441 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 20 19:16:01.317886 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1
Mar 20 19:16:01.853727 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 20 19:16:03.411935 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 20 19:16:03.411935 ignition[899]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 20 19:16:03.418689 ignition[899]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 20 19:16:03.418689 ignition[899]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 20 19:16:03.418689 ignition[899]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 20 19:16:03.418689 ignition[899]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 20 19:16:03.418689 ignition[899]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 20 19:16:03.418689 ignition[899]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 20 19:16:03.418689 ignition[899]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 20 19:16:03.418689 ignition[899]: INFO : files: files passed
Mar 20 19:16:03.418689 ignition[899]: INFO : Ignition finished successfully
Mar 20 19:16:03.414972 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 20 19:16:03.419679 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 20 19:16:03.423459 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 20 19:16:03.441751 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 20 19:16:03.441855 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 20 19:16:03.447441 initrd-setup-root-after-ignition[927]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 20 19:16:03.447441 initrd-setup-root-after-ignition[927]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 20 19:16:03.449887 initrd-setup-root-after-ignition[932]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 20 19:16:03.452562 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 20 19:16:03.453288 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 20 19:16:03.456485 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 20 19:16:03.501078 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 20 19:16:03.501277 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 20 19:16:03.503820 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 20 19:16:03.504737 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 20 19:16:03.517433 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 20 19:16:03.519001 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 20 19:16:03.560212 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 20 19:16:03.565572 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 20 19:16:03.606150 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 20 19:16:03.607858 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 20 19:16:03.611005 systemd[1]: Stopped target timers.target - Timer Units.
Mar 20 19:16:03.613805 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 20 19:16:03.614128 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 20 19:16:03.617042 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 20 19:16:03.618868 systemd[1]: Stopped target basic.target - Basic System.
Mar 20 19:16:03.621890 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 20 19:16:03.624495 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 20 19:16:03.627012 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 20 19:16:03.629936 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 20 19:16:03.635671 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 20 19:16:03.637416 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 20 19:16:03.640203 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 20 19:16:03.643140 systemd[1]: Stopped target swap.target - Swaps.
Mar 20 19:16:03.645707 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 20 19:16:03.646140 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 20 19:16:03.648969 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 20 19:16:03.650813 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 20 19:16:03.653253 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 20 19:16:03.654166 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 20 19:16:03.656324 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 20 19:16:03.656657 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 20 19:16:03.660528 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 20 19:16:03.660873 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 20 19:16:03.663833 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 20 19:16:03.664101 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 20 19:16:03.670796 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 20 19:16:03.676782 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 20 19:16:03.680250 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 20 19:16:03.680608 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 20 19:16:03.683966 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 20 19:16:03.684125 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 20 19:16:03.692492 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 20 19:16:03.692573 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 20 19:16:03.707925 ignition[952]: INFO : Ignition 2.20.0
Mar 20 19:16:03.707925 ignition[952]: INFO : Stage: umount
Mar 20 19:16:03.709405 ignition[952]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 20 19:16:03.709405 ignition[952]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 20 19:16:03.709405 ignition[952]: INFO : umount: umount passed
Mar 20 19:16:03.709405 ignition[952]: INFO : Ignition finished successfully
Mar 20 19:16:03.710469 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 20 19:16:03.710590 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 20 19:16:03.712036 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 20 19:16:03.712083 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 20 19:16:03.712692 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 20 19:16:03.712732 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 20 19:16:03.713647 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 20 19:16:03.713686 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 20 19:16:03.714593 systemd[1]: Stopped target network.target - Network.
Mar 20 19:16:03.715573 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 20 19:16:03.715618 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 20 19:16:03.717675 systemd[1]: Stopped target paths.target - Path Units.
Mar 20 19:16:03.718582 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 20 19:16:03.718833 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 20 19:16:03.719865 systemd[1]: Stopped target slices.target - Slice Units.
Mar 20 19:16:03.720909 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 20 19:16:03.722545 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 20 19:16:03.722579 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 20 19:16:03.723048 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 20 19:16:03.723078 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 20 19:16:03.723566 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 20 19:16:03.723608 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 20 19:16:03.724092 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 20 19:16:03.724130 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 20 19:16:03.725193 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 20 19:16:03.726213 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 20 19:16:03.728336 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 20 19:16:03.728896 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 20 19:16:03.728983 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 20 19:16:03.731012 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 20 19:16:03.731079 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 20 19:16:03.732203 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 20 19:16:03.732300 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 20 19:16:03.735421 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 20 19:16:03.735915 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 20 19:16:03.735974 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 20 19:16:03.739387 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 20 19:16:03.739612 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 20 19:16:03.739734 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 20 19:16:03.741541 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 20 19:16:03.742047 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 20 19:16:03.742234 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 20 19:16:03.744444 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 20 19:16:03.745674 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 20 19:16:03.745723 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 20 19:16:03.748450 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 20 19:16:03.748493 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 20 19:16:03.749684 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 20 19:16:03.749725 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 20 19:16:03.751462 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 20 19:16:03.759342 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 20 19:16:03.762597 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 20 19:16:03.762737 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 20 19:16:03.764711 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 20 19:16:03.764764 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 20 19:16:03.766145 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 20 19:16:03.766176 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 20 19:16:03.767319 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 20 19:16:03.767417 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 20 19:16:03.768817 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 20 19:16:03.768858 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 20 19:16:03.769757 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 20 19:16:03.769801 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 20 19:16:03.771451 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 20 19:16:03.773463 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 20 19:16:03.773513 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 20 19:16:03.775033 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 20 19:16:03.775077 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 20 19:16:03.776236 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 20 19:16:03.776278 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 20 19:16:03.777418 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 20 19:16:03.777461 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 19:16:03.783598 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 20 19:16:03.783683 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 20 19:16:03.787604 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 20 19:16:03.787695 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 20 19:16:03.788982 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 20 19:16:03.790458 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 20 19:16:03.811528 systemd[1]: Switching root.
Mar 20 19:16:03.846432 systemd-journald[184]: Journal stopped
Mar 20 19:16:05.189622 systemd-journald[184]: Received SIGTERM from PID 1 (systemd).
Mar 20 19:16:05.189676 kernel: SELinux: policy capability network_peer_controls=1
Mar 20 19:16:05.189701 kernel: SELinux: policy capability open_perms=1
Mar 20 19:16:05.189716 kernel: SELinux: policy capability extended_socket_class=1
Mar 20 19:16:05.189728 kernel: SELinux: policy capability always_check_network=0
Mar 20 19:16:05.189740 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 20 19:16:05.189754 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 20 19:16:05.189766 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 20 19:16:05.189778 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 20 19:16:05.189789 kernel: audit: type=1403 audit(1742498164.204:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 20 19:16:05.189802 systemd[1]: Successfully loaded SELinux policy in 44.039ms.
Mar 20 19:16:05.189821 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.679ms.
Mar 20 19:16:05.189835 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 20 19:16:05.189848 systemd[1]: Detected virtualization kvm.
Mar 20 19:16:05.189863 systemd[1]: Detected architecture x86-64.
Mar 20 19:16:05.189875 systemd[1]: Detected first boot.
Mar 20 19:16:05.189887 systemd[1]: Hostname set to .
Mar 20 19:16:05.189900 systemd[1]: Initializing machine ID from VM UUID.
Mar 20 19:16:05.189912 zram_generator::config[998]: No configuration found.
Mar 20 19:16:05.189925 kernel: Guest personality initialized and is inactive
Mar 20 19:16:05.189936 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Mar 20 19:16:05.189957 kernel: Initialized host personality
Mar 20 19:16:05.189970 kernel: NET: Registered PF_VSOCK protocol family
Mar 20 19:16:05.189982 systemd[1]: Populated /etc with preset unit settings.
Mar 20 19:16:05.189995 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 20 19:16:05.190008 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 20 19:16:05.190020 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 20 19:16:05.190032 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 20 19:16:05.190044 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 20 19:16:05.190057 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 20 19:16:05.190069 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 20 19:16:05.190087 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 20 19:16:05.190100 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 20 19:16:05.190112 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 20 19:16:05.190128 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 20 19:16:05.190140 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 20 19:16:05.190152 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 20 19:16:05.190165 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 20 19:16:05.190177 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 20 19:16:05.190190 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 20 19:16:05.190206 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 20 19:16:05.190219 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 20 19:16:05.190232 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 20 19:16:05.190244 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 20 19:16:05.190256 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 20 19:16:05.190269 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 20 19:16:05.190283 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 20 19:16:05.190296 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 20 19:16:05.190308 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 20 19:16:05.190320 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 20 19:16:05.190333 systemd[1]: Reached target slices.target - Slice Units.
Mar 20 19:16:05.198595 systemd[1]: Reached target swap.target - Swaps.
Mar 20 19:16:05.198644 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 20 19:16:05.198657 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 20 19:16:05.198671 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 20 19:16:05.198688 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 20 19:16:05.198702 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 20 19:16:05.198714 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 20 19:16:05.198727 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 20 19:16:05.198739 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 20 19:16:05.198751 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 20 19:16:05.198764 systemd[1]: Mounting media.mount - External Media Directory...
Mar 20 19:16:05.198776 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 19:16:05.198788 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 20 19:16:05.198802 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 20 19:16:05.198815 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 20 19:16:05.198828 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 20 19:16:05.198841 systemd[1]: Reached target machines.target - Containers.
Mar 20 19:16:05.198853 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 20 19:16:05.198866 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 20 19:16:05.198878 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 20 19:16:05.198890 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 20 19:16:05.198905 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 20 19:16:05.198917 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 20 19:16:05.198929 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 20 19:16:05.198941 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 20 19:16:05.198954 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 20 19:16:05.198966 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 20 19:16:05.198978 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 20 19:16:05.198991 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 20 19:16:05.199005 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 20 19:16:05.199017 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 20 19:16:05.199030 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 20 19:16:05.199043 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 20 19:16:05.199055 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 20 19:16:05.199067 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 20 19:16:05.199080 kernel: loop: module loaded
Mar 20 19:16:05.199092 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 20 19:16:05.199105 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 20 19:16:05.199119 kernel: fuse: init (API version 7.39)
Mar 20 19:16:05.199131 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 20 19:16:05.199143 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 20 19:16:05.199155 systemd[1]: Stopped verity-setup.service.
Mar 20 19:16:05.199168 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 19:16:05.199182 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 20 19:16:05.199195 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 20 19:16:05.199209 systemd[1]: Mounted media.mount - External Media Directory.
Mar 20 19:16:05.199221 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 20 19:16:05.199234 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 20 19:16:05.199247 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 20 19:16:05.199288 systemd-journald[1088]: Collecting audit messages is disabled.
Mar 20 19:16:05.199318 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 20 19:16:05.199333 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 20 19:16:05.204772 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 20 19:16:05.204801 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 20 19:16:05.204814 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 20 19:16:05.204827 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 20 19:16:05.204839 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 20 19:16:05.204851 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 20 19:16:05.204868 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 20 19:16:05.204882 systemd-journald[1088]: Journal started
Mar 20 19:16:05.204913 systemd-journald[1088]: Runtime Journal (/run/log/journal/390ab905d6c740e9a6f8ddced2aeaaed) is 8M, max 78.2M, 70.2M free.
Mar 20 19:16:04.854696 systemd[1]: Queued start job for default target multi-user.target.
Mar 20 19:16:04.863467 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 20 19:16:04.863896 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 20 19:16:05.217841 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 20 19:16:05.209395 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 20 19:16:05.211414 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 20 19:16:05.212262 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 20 19:16:05.213037 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 20 19:16:05.213832 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 20 19:16:05.222145 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 20 19:16:05.227129 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 20 19:16:05.230723 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 20 19:16:05.235430 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 20 19:16:05.247377 kernel: ACPI: bus type drm_connector registered
Mar 20 19:16:05.244443 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 20 19:16:05.245013 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 20 19:16:05.245044 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 20 19:16:05.246860 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 20 19:16:05.252465 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 20 19:16:05.254697 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 20 19:16:05.256505 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 20 19:16:05.259717 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 20 19:16:05.265499 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 20 19:16:05.266137 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 20 19:16:05.268194 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 20 19:16:05.270464 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 20 19:16:05.271630 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 20 19:16:05.287680 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 20 19:16:05.291873 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 20 19:16:05.295342 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 20 19:16:05.296200 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 20 19:16:05.297010 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 20 19:16:05.297854 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 20 19:16:05.298917 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 20 19:16:05.299793 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 20 19:16:05.302707 systemd-journald[1088]: Time spent on flushing to /var/log/journal/390ab905d6c740e9a6f8ddced2aeaaed is 72.535ms for 959 entries.
Mar 20 19:16:05.302707 systemd-journald[1088]: System Journal (/var/log/journal/390ab905d6c740e9a6f8ddced2aeaaed) is 8M, max 584.8M, 576.8M free.
Mar 20 19:16:05.409536 systemd-journald[1088]: Received client request to flush runtime journal.
Mar 20 19:16:05.409577 kernel: loop0: detected capacity change from 0 to 151640
Mar 20 19:16:05.305629 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 20 19:16:05.319415 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 20 19:16:05.320198 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 20 19:16:05.323222 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 20 19:16:05.341757 udevadm[1144]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Mar 20 19:16:05.346841 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 20 19:16:05.411684 systemd-tmpfiles[1137]: ACLs are not supported, ignoring.
Mar 20 19:16:05.411698 systemd-tmpfiles[1137]: ACLs are not supported, ignoring.
Mar 20 19:16:05.413341 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 20 19:16:05.418097 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 20 19:16:05.419844 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 20 19:16:05.445616 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 20 19:16:05.479372 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 20 19:16:05.509390 kernel: loop1: detected capacity change from 0 to 109808
Mar 20 19:16:05.521052 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 20 19:16:05.526828 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 20 19:16:05.559375 kernel: loop2: detected capacity change from 0 to 210664
Mar 20 19:16:05.560884 systemd-tmpfiles[1160]: ACLs are not supported, ignoring.
Mar 20 19:16:05.560903 systemd-tmpfiles[1160]: ACLs are not supported, ignoring.
Mar 20 19:16:05.568620 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 20 19:16:05.628103 kernel: loop3: detected capacity change from 0 to 8
Mar 20 19:16:05.655383 kernel: loop4: detected capacity change from 0 to 151640
Mar 20 19:16:05.703534 kernel: loop5: detected capacity change from 0 to 109808
Mar 20 19:16:05.750411 kernel: loop6: detected capacity change from 0 to 210664
Mar 20 19:16:05.797405 kernel: loop7: detected capacity change from 0 to 8
Mar 20 19:16:05.797525 (sd-merge)[1166]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Mar 20 19:16:05.798767 (sd-merge)[1166]: Merged extensions into '/usr'.
Mar 20 19:16:05.804411 systemd[1]: Reload requested from client PID 1136 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 20 19:16:05.804424 systemd[1]: Reloading...
Mar 20 19:16:05.904372 zram_generator::config[1194]: No configuration found.
Mar 20 19:16:06.066065 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 20 19:16:06.146678 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 20 19:16:06.147013 systemd[1]: Reloading finished in 342 ms.
Mar 20 19:16:06.168089 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 20 19:16:06.179189 systemd[1]: Starting ensure-sysext.service...
Mar 20 19:16:06.181293 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 20 19:16:06.211442 systemd[1]: Reload requested from client PID 1249 ('systemctl') (unit ensure-sysext.service)...
Mar 20 19:16:06.211458 systemd[1]: Reloading...
Mar 20 19:16:06.227880 systemd-tmpfiles[1250]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 20 19:16:06.228850 systemd-tmpfiles[1250]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 20 19:16:06.229798 systemd-tmpfiles[1250]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 20 19:16:06.230277 systemd-tmpfiles[1250]: ACLs are not supported, ignoring.
Mar 20 19:16:06.230450 systemd-tmpfiles[1250]: ACLs are not supported, ignoring.
Mar 20 19:16:06.243702 systemd-tmpfiles[1250]: Detected autofs mount point /boot during canonicalization of boot.
Mar 20 19:16:06.243714 systemd-tmpfiles[1250]: Skipping /boot
Mar 20 19:16:06.259100 systemd-tmpfiles[1250]: Detected autofs mount point /boot during canonicalization of boot.
Mar 20 19:16:06.259372 systemd-tmpfiles[1250]: Skipping /boot
Mar 20 19:16:06.294431 zram_generator::config[1280]: No configuration found.
Mar 20 19:16:06.310544 ldconfig[1131]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 20 19:16:06.438985 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 20 19:16:06.519696 systemd[1]: Reloading finished in 307 ms.
Mar 20 19:16:06.537894 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 20 19:16:06.540027 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 20 19:16:06.555642 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 20 19:16:06.575614 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 20 19:16:06.585647 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 20 19:16:06.589838 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 20 19:16:06.601611 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 20 19:16:06.607554 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 20 19:16:06.611624 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 20 19:16:06.616208 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 19:16:06.616573 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 20 19:16:06.622810 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 20 19:16:06.627967 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 20 19:16:06.630258 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 20 19:16:06.631391 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 20 19:16:06.631506 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 20 19:16:06.637770 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 20 19:16:06.638312 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 19:16:06.644317 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 19:16:06.644688 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 20 19:16:06.644849 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 20 19:16:06.644947 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 20 19:16:06.645049 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 19:16:06.649857 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 19:16:06.650294 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 20 19:16:06.651830 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 20 19:16:06.652679 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 20 19:16:06.652858 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 20 19:16:06.653098 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 20 19:16:06.656038 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 20 19:16:06.656748 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 20 19:16:06.660717 systemd[1]: Finished ensure-sysext.service.
Mar 20 19:16:06.668558 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 20 19:16:06.686519 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 20 19:16:06.686687 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 20 19:16:06.690933 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 20 19:16:06.701541 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 20 19:16:06.701723 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 20 19:16:06.702714 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 20 19:16:06.704145 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 20 19:16:06.709462 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 20 19:16:06.710377 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 20 19:16:06.710550 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 20 19:16:06.711717 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 20 19:16:06.712303 systemd-udevd[1349]: Using default interface naming scheme 'v255'.
Mar 20 19:16:06.737428 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 20 19:16:06.738458 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 20 19:16:06.740655 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 20 19:16:06.744627 augenrules[1380]: No rules
Mar 20 19:16:06.745695 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 20 19:16:06.745895 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 20 19:16:06.756454 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 20 19:16:06.759794 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 20 19:16:06.760408 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 20 19:16:06.862584 systemd-resolved[1343]: Positive Trust Anchors:
Mar 20 19:16:06.862602 systemd-resolved[1343]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 20 19:16:06.862644 systemd-resolved[1343]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 20 19:16:06.870859 systemd-resolved[1343]: Using system hostname 'ci-9999-0-1-1-f6fba67404.novalocal'.
Mar 20 19:16:06.872105 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 20 19:16:06.873718 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 20 19:16:06.896970 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 20 19:16:06.897442 systemd-networkd[1393]: lo: Link UP
Mar 20 19:16:06.897450 systemd-networkd[1393]: lo: Gained carrier
Mar 20 19:16:06.899142 systemd-networkd[1393]: Enumeration completed
Mar 20 19:16:06.899223 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 20 19:16:06.900296 systemd[1]: Reached target network.target - Network.
Mar 20 19:16:06.903528 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 20 19:16:06.904957 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 20 19:16:06.906564 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 20 19:16:06.907127 systemd[1]: Reached target time-set.target - System Time Set.
Mar 20 19:16:06.933678 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1404)
Mar 20 19:16:06.942603 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 20 19:16:06.979056 systemd-networkd[1393]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 19:16:06.979066 systemd-networkd[1393]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 20 19:16:06.981636 systemd-networkd[1393]: eth0: Link UP
Mar 20 19:16:06.981644 systemd-networkd[1393]: eth0: Gained carrier
Mar 20 19:16:06.981658 systemd-networkd[1393]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 20 19:16:06.997495 systemd-networkd[1393]: eth0: DHCPv4 address 172.24.4.12/24, gateway 172.24.4.1 acquired from 172.24.4.1
Mar 20 19:16:06.998092 systemd-timesyncd[1359]: Network configuration changed, trying to establish connection.
Mar 20 19:16:07.001049 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 20 19:16:07.002435 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Mar 20 19:16:07.004252 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 20 19:16:07.010407 kernel: ACPI: button: Power Button [PWRF]
Mar 20 19:16:07.035821 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 20 19:16:07.050394 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Mar 20 19:16:07.065369 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Mar 20 19:16:07.082371 kernel: mousedev: PS/2 mouse device common for all mice
Mar 20 19:16:07.084243 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 19:16:07.101378 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Mar 20 19:16:07.105321 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Mar 20 19:16:07.106364 kernel: Console: switching to colour dummy device 80x25
Mar 20 19:16:07.106801 systemd-timesyncd[1359]: Contacted time server 45.55.58.103:123 (0.flatcar.pool.ntp.org).
Mar 20 19:16:07.106857 systemd-timesyncd[1359]: Initial clock synchronization to Thu 2025-03-20 19:16:07.303373 UTC.
Mar 20 19:16:07.107996 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Mar 20 19:16:07.108032 kernel: [drm] features: -context_init
Mar 20 19:16:07.109367 kernel: [drm] number of scanouts: 1
Mar 20 19:16:07.109394 kernel: [drm] number of cap sets: 0
Mar 20 19:16:07.110377 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0
Mar 20 19:16:07.119658 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Mar 20 19:16:07.119727 kernel: Console: switching to colour frame buffer device 160x50
Mar 20 19:16:07.125406 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Mar 20 19:16:07.128198 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 20 19:16:07.129191 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 19:16:07.136155 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 19:16:07.141137 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 20 19:16:07.141323 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 19:16:07.144102 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 20 19:16:07.145776 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 20 19:16:07.153749 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 20 19:16:07.174979 lvm[1439]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 20 19:16:07.212994 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 20 19:16:07.213177 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 20 19:16:07.214549 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 20 19:16:07.228189 lvm[1443]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 20 19:16:07.252444 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 20 19:16:07.255093 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 20 19:16:07.255581 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 20 19:16:07.255821 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 20 19:16:07.256319 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 20 19:16:07.256715 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 20 19:16:07.256901 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 20 19:16:07.257058 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 20 19:16:07.257136 systemd[1]: Reached target paths.target - Path Units.
Mar 20 19:16:07.257270 systemd[1]: Reached target timers.target - Timer Units.
Mar 20 19:16:07.259899 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 20 19:16:07.261193 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 20 19:16:07.264039 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 20 19:16:07.265340 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 20 19:16:07.265600 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 20 19:16:07.272022 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 20 19:16:07.273685 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 20 19:16:07.276761 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 20 19:16:07.280215 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 20 19:16:07.284757 systemd[1]: Reached target sockets.target - Socket Units.
Mar 20 19:16:07.286599 systemd[1]: Reached target basic.target - Basic System.
Mar 20 19:16:07.288858 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 20 19:16:07.288941 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 20 19:16:07.292110 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 20 19:16:07.307043 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 20 19:16:07.319652 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 20 19:16:07.331544 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 20 19:16:07.345676 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 20 19:16:07.346272 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 20 19:16:07.350634 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 20 19:16:07.353636 jq[1455]: false
Mar 20 19:16:07.355673 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 20 19:16:07.363450 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 20 19:16:07.369493 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 20 19:16:07.380256 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 20 19:16:07.381826 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 20 19:16:07.385854 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 20 19:16:07.389218 systemd[1]: Starting update-engine.service - Update Engine...
Mar 20 19:16:07.394941 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 20 19:16:07.400571 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 20 19:16:07.401301 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 20 19:16:07.401581 systemd[1]: motdgen.service: Deactivated successfully.
Mar 20 19:16:07.401766 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 20 19:16:07.411123 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 20 19:16:07.422999 jq[1471]: true
Mar 20 19:16:07.411342 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 20 19:16:07.428747 dbus-daemon[1452]: [system] SELinux support is enabled
Mar 20 19:16:07.431499 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 20 19:16:07.452800 update_engine[1469]: I20250320 19:16:07.440220 1469 main.cc:92] Flatcar Update Engine starting
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found loop4
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found loop5
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found loop6
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found loop7
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found vda
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found vda1
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found vda2
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found vda3
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found usr
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found vda4
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found vda6
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found vda7
Mar 20 19:16:07.453045 extend-filesystems[1456]: Found vda9
Mar 20 19:16:07.453045 extend-filesystems[1456]: Checking size of /dev/vda9
Mar 20 19:16:07.531330 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1406)
Mar 20 19:16:07.441830 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 20 19:16:07.531472 tar[1474]: linux-amd64/helm
Mar 20 19:16:07.534939 extend-filesystems[1456]: Resized partition /dev/vda9
Mar 20 19:16:07.541885 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks
Mar 20 19:16:07.542009 update_engine[1469]: I20250320 19:16:07.475747 1469 update_check_scheduler.cc:74] Next update check in 2m1s
Mar 20 19:16:07.441872 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 20 19:16:07.542161 extend-filesystems[1492]: resize2fs 1.47.2 (1-Jan-2025)
Mar 20 19:16:07.542914 jq[1480]: true
Mar 20 19:16:07.447766 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 20 19:16:07.447789 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 20 19:16:07.475704 systemd[1]: Started update-engine.service - Update Engine.
Mar 20 19:16:07.486675 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 20 19:16:07.495147 (ntainerd)[1481]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 20 19:16:07.574101 kernel: EXT4-fs (vda9): resized filesystem to 2014203
Mar 20 19:16:07.621220 extend-filesystems[1492]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Mar 20 19:16:07.621220 extend-filesystems[1492]: old_desc_blocks = 1, new_desc_blocks = 1
Mar 20 19:16:07.621220 extend-filesystems[1492]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long.
Mar 20 19:16:07.636643 extend-filesystems[1456]: Resized filesystem in /dev/vda9
Mar 20 19:16:07.631847 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 20 19:16:07.632050 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 20 19:16:07.672138 bash[1508]: Updated "/home/core/.ssh/authorized_keys"
Mar 20 19:16:07.673155 systemd-logind[1464]: New seat seat0.
Mar 20 19:16:07.673199 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 20 19:16:07.683497 systemd[1]: Starting sshkeys.service...
Mar 20 19:16:07.688823 systemd-logind[1464]: Watching system buttons on /dev/input/event1 (Power Button)
Mar 20 19:16:07.688844 systemd-logind[1464]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 20 19:16:07.690075 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 20 19:16:07.729729 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 20 19:16:07.738629 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 20 19:16:07.758276 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 20 19:16:07.827307 sshd_keygen[1475]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 20 19:16:07.858461 locksmithd[1488]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 20 19:16:07.865747 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 20 19:16:07.875646 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 20 19:16:07.878785 systemd[1]: Started sshd@0-172.24.4.12:22-172.24.4.1:57382.service - OpenSSH per-connection server daemon (172.24.4.1:57382).
Mar 20 19:16:07.901722 systemd[1]: issuegen.service: Deactivated successfully.
Mar 20 19:16:07.901905 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 20 19:16:07.912566 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 20 19:16:07.943833 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 20 19:16:07.951704 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 20 19:16:07.961156 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 20 19:16:07.962216 systemd[1]: Reached target getty.target - Login Prompts.
Mar 20 19:16:07.986405 containerd[1481]: time="2025-03-20T19:16:07Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 20 19:16:07.989407 containerd[1481]: time="2025-03-20T19:16:07.987146479Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
Mar 20 19:16:08.000481 containerd[1481]: time="2025-03-20T19:16:08.000446336Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.199µs"
Mar 20 19:16:08.000592 containerd[1481]: time="2025-03-20T19:16:08.000573562Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 20 19:16:08.000657 containerd[1481]: time="2025-03-20T19:16:08.000642430Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 20 19:16:08.000860 containerd[1481]: time="2025-03-20T19:16:08.000841203Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 20 19:16:08.000926 containerd[1481]: time="2025-03-20T19:16:08.000912299Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 20 19:16:08.001011 containerd[1481]: time="2025-03-20T19:16:08.000995487Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 20 19:16:08.001140 containerd[1481]: time="2025-03-20T19:16:08.001119335Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 20 19:16:08.001205 containerd[1481]: time="2025-03-20T19:16:08.001189989Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 20 19:16:08.001493 containerd[1481]: time="2025-03-20T19:16:08.001469806Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 20 19:16:08.001554 containerd[1481]: time="2025-03-20T19:16:08.001540932Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 20 19:16:08.001612 containerd[1481]: time="2025-03-20T19:16:08.001597328Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 20 19:16:08.001710 containerd[1481]: time="2025-03-20T19:16:08.001694467Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 20 19:16:08.001843 containerd[1481]: time="2025-03-20T19:16:08.001824433Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 20 19:16:08.002121 containerd[1481]: time="2025-03-20T19:16:08.002101128Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 20 19:16:08.002209 containerd[1481]: time="2025-03-20T19:16:08.002191963Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 20 19:16:08.002265 containerd[1481]: time="2025-03-20T19:16:08.002252333Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 20 19:16:08.002351 containerd[1481]: time="2025-03-20T19:16:08.002335223Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 20 19:16:08.002667 containerd[1481]: time="2025-03-20T19:16:08.002647979Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 20 19:16:08.002780 containerd[1481]: time="2025-03-20T19:16:08.002763585Z" level=info msg="metadata content store policy set" policy=shared
Mar 20 19:16:08.012786 containerd[1481]: time="2025-03-20T19:16:08.012753627Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 20 19:16:08.012887 containerd[1481]: time="2025-03-20T19:16:08.012871356Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 20 19:16:08.012997 containerd[1481]: time="2025-03-20T19:16:08.012979992Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 20 19:16:08.013062 containerd[1481]: time="2025-03-20T19:16:08.013047207Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 20 19:16:08.013126 containerd[1481]: time="2025-03-20T19:16:08.013112494Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 20 19:16:08.013191 containerd[1481]: time="2025-03-20T19:16:08.013176496Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 20 19:16:08.013268 containerd[1481]: time="2025-03-20T19:16:08.013252345Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 20 19:16:08.013330 containerd[1481]: time="2025-03-20T19:16:08.013316768Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 20 19:16:08.013407 containerd[1481]: time="2025-03-20T19:16:08.013392709Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 20 19:16:08.013466 containerd[1481]: time="2025-03-20T19:16:08.013452708Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 20 19:16:08.013522 containerd[1481]: time="2025-03-20T19:16:08.013508109Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 20 19:16:08.013579 containerd[1481]: time="2025-03-20T19:16:08.013566363Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 20 19:16:08.013723 containerd[1481]: time="2025-03-20T19:16:08.013705537Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 20 19:16:08.013806 containerd[1481]: time="2025-03-20T19:16:08.013790563Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 20 19:16:08.013867 containerd[1481]: time="2025-03-20T19:16:08.013853498Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 20 19:16:08.013931 containerd[1481]: time="2025-03-20T19:16:08.013917562Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 20 19:16:08.013989 containerd[1481]: time="2025-03-20T19:16:08.013976155Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 20 19:16:08.014052 containerd[1481]: time="2025-03-20T19:16:08.014037776Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 20 19:16:08.014125 containerd[1481]: time="2025-03-20T19:16:08.014110125Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 20 19:16:08.014192 containerd[1481]: time="2025-03-20T19:16:08.014177423Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 20 19:16:08.014250 containerd[1481]: time="2025-03-20T19:16:08.014237648Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 20 19:16:08.014331 containerd[1481]: time="2025-03-20T19:16:08.014316587Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 20 19:16:08.014414 containerd[1481]: time="2025-03-20T19:16:08.014399014Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 20 19:16:08.014538 containerd[1481]: time="2025-03-20T19:16:08.014520964Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 20 19:16:08.014601 containerd[1481]: time="2025-03-20T19:16:08.014588180Z" level=info msg="Start snapshots syncer"
Mar 20 19:16:08.014677 containerd[1481]: time="2025-03-20T19:16:08.014662017Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 20 19:16:08.015052 containerd[1481]: time="2025-03-20T19:16:08.015009931Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 20 19:16:08.015229 containerd[1481]: time="2025-03-20T19:16:08.015210756Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 20 19:16:08.015349 containerd[1481]: time="2025-03-20T19:16:08.015332890Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 20 19:16:08.015538 containerd[1481]: time="2025-03-20T19:16:08.015519674Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 20 19:16:08.015610 containerd[1481]: time="2025-03-20T19:16:08.015596282Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 20 19:16:08.015670 containerd[1481]: time="2025-03-20T19:16:08.015656548Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 20 19:16:08.015728 containerd[1481]: time="2025-03-20T19:16:08.015714638Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 20 19:16:08.015788 containerd[1481]: time="2025-03-20T19:16:08.015774710Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 20 19:16:08.015852 containerd[1481]: time="2025-03-20T19:16:08.015838733Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 20 19:16:08.015919 containerd[1481]: time="2025-03-20T19:16:08.015904922Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 20 19:16:08.015991 containerd[1481]: time="2025-03-20T19:16:08.015976541Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 20 19:16:08.016052 containerd[1481]: time="2025-03-20T19:16:08.016038193Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 20 19:16:08.016109 containerd[1481]: time="2025-03-20T19:16:08.016096109Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 20 19:16:08.016189 containerd[1481]: time="2025-03-20T19:16:08.016174709Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 20 19:16:08.016284 containerd[1481]: time="2025-03-20T19:16:08.016267115Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 20 19:16:08.016345 containerd[1481]: time="2025-03-20T19:16:08.016331723Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 20 19:16:08.016439 containerd[1481]: time="2025-03-20T19:16:08.016421327Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 20 19:16:08.016494 containerd[1481]: time="2025-03-20T19:16:08.016481367Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 20 19:16:08.016551 containerd[1481]: time="2025-03-20T19:16:08.016536080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 20 19:16:08.016609 containerd[1481]: time="2025-03-20T19:16:08.016595833Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 20 19:16:08.016671 containerd[1481]: time="2025-03-20T19:16:08.016657629Z" level=info msg="runtime interface created"
Mar 20 19:16:08.016719 containerd[1481]: time="2025-03-20T19:16:08.016708194Z" level=info msg="created NRI interface"
Mar 20 19:16:08.016785 containerd[1481]: time="2025-03-20T19:16:08.016770052Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 20 19:16:08.016847 containerd[1481]: time="2025-03-20T19:16:08.016835071Z" level=info msg="Connect containerd service"
Mar 20 19:16:08.016932 containerd[1481]: time="2025-03-20T19:16:08.016917654Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 20 19:16:08.017703
containerd[1481]: time="2025-03-20T19:16:08.017681375Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.181647249Z" level=info msg="Start subscribing containerd event" Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.181704220Z" level=info msg="Start recovering state" Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.181787448Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.181819435Z" level=info msg="Start event monitor" Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.181839954Z" level=info msg="Start cni network conf syncer for default" Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.181847417Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.181848957Z" level=info msg="Start streaming server" Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.181886640Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.181895663Z" level=info msg="runtime interface starting up..." Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.181904388Z" level=info msg="starting plugins..." Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.181919683Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 20 19:16:08.185277 containerd[1481]: time="2025-03-20T19:16:08.182058016Z" level=info msg="containerd successfully booted in 0.196013s" Mar 20 19:16:08.182164 systemd[1]: Started containerd.service - containerd container runtime. 
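The single level=error record in the containerd startup above is the usual complaint on a node where no pod network has been installed yet: the CRI plugin scans confDir (/etc/cni/net.d per the config dump) and finds no network configuration. A minimal bridge conflist of the shape that would satisfy that check might look like the following; the network name, bridge name, subnet, and the filename /etc/cni/net.d/10-mynet.conflist are illustrative assumptions, not values taken from this log:

```json
{
  "cniVersion": "1.0.0",
  "name": "mynet",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "subnet": "10.85.0.0/16"
      }
    }
  ]
}
```

In practice a CNI provider (or kubeadm plus a network add-on) drops a file like this in place, after which the "cni plugin not initialized" error stops recurring.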
Mar 20 19:16:08.253648 tar[1474]: linux-amd64/LICENSE
Mar 20 19:16:08.253759 tar[1474]: linux-amd64/README.md
Mar 20 19:16:08.272502 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 20 19:16:08.349718 systemd-networkd[1393]: eth0: Gained IPv6LL
Mar 20 19:16:08.353474 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 20 19:16:08.363093 systemd[1]: Reached target network-online.target - Network is Online.
Mar 20 19:16:08.371673 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 19:16:08.382222 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 20 19:16:08.449931 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 20 19:16:08.847123 sshd[1534]: Accepted publickey for core from 172.24.4.1 port 57382 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:16:08.851538 sshd-session[1534]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:16:08.887093 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 20 19:16:08.887662 systemd-logind[1464]: New session 1 of user core.
Mar 20 19:16:08.893672 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 20 19:16:08.922299 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 20 19:16:08.931689 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 20 19:16:08.945940 (systemd)[1576]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 20 19:16:08.953952 systemd-logind[1464]: New session c1 of user core.
Mar 20 19:16:09.121481 systemd[1576]: Queued start job for default target default.target.
Mar 20 19:16:09.129619 systemd[1576]: Created slice app.slice - User Application Slice.
Mar 20 19:16:09.129734 systemd[1576]: Reached target paths.target - Paths.
Mar 20 19:16:09.129906 systemd[1576]: Reached target timers.target - Timers.
Mar 20 19:16:09.133442 systemd[1576]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 20 19:16:09.142496 systemd[1576]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 20 19:16:09.143469 systemd[1576]: Reached target sockets.target - Sockets.
Mar 20 19:16:09.143677 systemd[1576]: Reached target basic.target - Basic System.
Mar 20 19:16:09.143811 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 20 19:16:09.144074 systemd[1576]: Reached target default.target - Main User Target.
Mar 20 19:16:09.144100 systemd[1576]: Startup finished in 182ms.
Mar 20 19:16:09.158586 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 20 19:16:09.669348 systemd[1]: Started sshd@1-172.24.4.12:22-172.24.4.1:57384.service - OpenSSH per-connection server daemon (172.24.4.1:57384).
Mar 20 19:16:10.057226 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 19:16:10.081479 (kubelet)[1593]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 19:16:11.410610 kubelet[1593]: E0320 19:16:11.410535 1593 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 19:16:11.414888 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 19:16:11.415215 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 19:16:11.415869 systemd[1]: kubelet.service: Consumed 1.953s CPU time, 247.4M memory peak.
Mar 20 19:16:11.439469 sshd[1587]: Accepted publickey for core from 172.24.4.1 port 57384 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:16:11.441863 sshd-session[1587]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:16:11.455636 systemd-logind[1464]: New session 2 of user core.
Mar 20 19:16:11.467970 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 20 19:16:12.205313 sshd[1604]: Connection closed by 172.24.4.1 port 57384
Mar 20 19:16:12.206414 sshd-session[1587]: pam_unix(sshd:session): session closed for user core
Mar 20 19:16:12.229988 systemd[1]: sshd@1-172.24.4.12:22-172.24.4.1:57384.service: Deactivated successfully.
Mar 20 19:16:12.233642 systemd[1]: session-2.scope: Deactivated successfully.
Mar 20 19:16:12.237766 systemd-logind[1464]: Session 2 logged out. Waiting for processes to exit.
Mar 20 19:16:12.240220 systemd[1]: Started sshd@2-172.24.4.12:22-172.24.4.1:57396.service - OpenSSH per-connection server daemon (172.24.4.1:57396).
Mar 20 19:16:12.249401 systemd-logind[1464]: Removed session 2.
Mar 20 19:16:13.033781 login[1541]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 20 19:16:13.039910 login[1542]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 20 19:16:13.045126 systemd-logind[1464]: New session 4 of user core.
Mar 20 19:16:13.052810 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 20 19:16:13.060279 systemd-logind[1464]: New session 3 of user core.
Mar 20 19:16:13.070795 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 20 19:16:13.424950 sshd[1609]: Accepted publickey for core from 172.24.4.1 port 57396 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:16:13.427636 sshd-session[1609]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:16:13.437577 systemd-logind[1464]: New session 5 of user core.
Mar 20 19:16:13.446889 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 20 19:16:14.178526 sshd[1638]: Connection closed by 172.24.4.1 port 57396
Mar 20 19:16:14.179556 sshd-session[1609]: pam_unix(sshd:session): session closed for user core
Mar 20 19:16:14.186671 systemd[1]: sshd@2-172.24.4.12:22-172.24.4.1:57396.service: Deactivated successfully.
Mar 20 19:16:14.191232 systemd[1]: session-5.scope: Deactivated successfully.
Mar 20 19:16:14.193314 systemd-logind[1464]: Session 5 logged out. Waiting for processes to exit.
Mar 20 19:16:14.196076 systemd-logind[1464]: Removed session 5.
Mar 20 19:16:14.396276 coreos-metadata[1451]: Mar 20 19:16:14.396 WARN failed to locate config-drive, using the metadata service API instead
Mar 20 19:16:14.446906 coreos-metadata[1451]: Mar 20 19:16:14.446 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1
Mar 20 19:16:14.642033 coreos-metadata[1451]: Mar 20 19:16:14.641 INFO Fetch successful
Mar 20 19:16:14.642033 coreos-metadata[1451]: Mar 20 19:16:14.641 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 20 19:16:14.655558 coreos-metadata[1451]: Mar 20 19:16:14.655 INFO Fetch successful
Mar 20 19:16:14.655558 coreos-metadata[1451]: Mar 20 19:16:14.655 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1
Mar 20 19:16:14.668735 coreos-metadata[1451]: Mar 20 19:16:14.668 INFO Fetch successful
Mar 20 19:16:14.668735 coreos-metadata[1451]: Mar 20 19:16:14.668 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1
Mar 20 19:16:14.683026 coreos-metadata[1451]: Mar 20 19:16:14.682 INFO Fetch successful
Mar 20 19:16:14.683026 coreos-metadata[1451]: Mar 20 19:16:14.682 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1
Mar 20 19:16:14.696495 coreos-metadata[1451]: Mar 20 19:16:14.696 INFO Fetch successful
Mar 20 19:16:14.696495 coreos-metadata[1451]: Mar 20 19:16:14.696 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1
Mar 20 19:16:14.710480 coreos-metadata[1451]: Mar 20 19:16:14.710 INFO Fetch successful
Mar 20 19:16:14.758481 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 20 19:16:14.760259 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 20 19:16:14.876557 coreos-metadata[1516]: Mar 20 19:16:14.876 WARN failed to locate config-drive, using the metadata service API instead
Mar 20 19:16:14.919296 coreos-metadata[1516]: Mar 20 19:16:14.919 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1
Mar 20 19:16:14.934209 coreos-metadata[1516]: Mar 20 19:16:14.934 INFO Fetch successful
Mar 20 19:16:14.934209 coreos-metadata[1516]: Mar 20 19:16:14.934 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1
Mar 20 19:16:14.947584 coreos-metadata[1516]: Mar 20 19:16:14.947 INFO Fetch successful
Mar 20 19:16:14.953223 unknown[1516]: wrote ssh authorized keys file for user: core
Mar 20 19:16:14.997619 update-ssh-keys[1653]: Updated "/home/core/.ssh/authorized_keys"
Mar 20 19:16:14.998806 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 20 19:16:15.002904 systemd[1]: Finished sshkeys.service.
Mar 20 19:16:15.007064 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 20 19:16:15.007549 systemd[1]: Startup finished in 1.233s (kernel) + 15.406s (initrd) + 10.846s (userspace) = 27.486s.
Mar 20 19:16:21.619531 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 20 19:16:21.622734 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 19:16:21.950438 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 19:16:21.964892 (kubelet)[1664]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 19:16:22.054986 kubelet[1664]: E0320 19:16:22.054925 1664 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 19:16:22.062300 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 19:16:22.062638 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 19:16:22.063531 systemd[1]: kubelet.service: Consumed 299ms CPU time, 96.9M memory peak.
Mar 20 19:16:24.262834 systemd[1]: Started sshd@3-172.24.4.12:22-172.24.4.1:50902.service - OpenSSH per-connection server daemon (172.24.4.1:50902).
Mar 20 19:16:25.446421 sshd[1672]: Accepted publickey for core from 172.24.4.1 port 50902 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:16:25.448893 sshd-session[1672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:16:25.459081 systemd-logind[1464]: New session 6 of user core.
Mar 20 19:16:25.470683 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 20 19:16:26.242411 sshd[1674]: Connection closed by 172.24.4.1 port 50902
Mar 20 19:16:26.243489 sshd-session[1672]: pam_unix(sshd:session): session closed for user core
Mar 20 19:16:26.261267 systemd[1]: sshd@3-172.24.4.12:22-172.24.4.1:50902.service: Deactivated successfully.
Mar 20 19:16:26.264722 systemd[1]: session-6.scope: Deactivated successfully.
Mar 20 19:16:26.268661 systemd-logind[1464]: Session 6 logged out. Waiting for processes to exit.
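The kubelet failure above, and its repeats as the systemd restart counter climbs, are expected on a node that has not yet been joined to a cluster: kubelet is started with a config file at /var/lib/kubelet/config.yaml, which kubeadm init or kubeadm join normally writes. A minimal KubeletConfiguration of the shape kubeadm would generate might look like the sketch below; every field here is an assumption for illustration, except cgroupDriver: systemd, which is at least consistent with the "SystemdCgroup":true setting visible in the containerd config dump earlier in this log:

```yaml
# Hypothetical /var/lib/kubelet/config.yaml — normally written by kubeadm,
# not by hand; field values are illustrative.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
staticPodPath: /etc/kubernetes/manifests
clusterDomain: cluster.local
```

Until such a file exists, the crash/restart loop seen in this log simply continues under the unit's Restart= policy.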
Mar 20 19:16:26.271277 systemd[1]: Started sshd@4-172.24.4.12:22-172.24.4.1:50914.service - OpenSSH per-connection server daemon (172.24.4.1:50914).
Mar 20 19:16:26.274289 systemd-logind[1464]: Removed session 6.
Mar 20 19:16:27.448331 sshd[1679]: Accepted publickey for core from 172.24.4.1 port 50914 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:16:27.451660 sshd-session[1679]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:16:27.466499 systemd-logind[1464]: New session 7 of user core.
Mar 20 19:16:27.473755 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 20 19:16:28.205404 sshd[1682]: Connection closed by 172.24.4.1 port 50914
Mar 20 19:16:28.205767 sshd-session[1679]: pam_unix(sshd:session): session closed for user core
Mar 20 19:16:28.224963 systemd[1]: sshd@4-172.24.4.12:22-172.24.4.1:50914.service: Deactivated successfully.
Mar 20 19:16:28.228959 systemd[1]: session-7.scope: Deactivated successfully.
Mar 20 19:16:28.232985 systemd-logind[1464]: Session 7 logged out. Waiting for processes to exit.
Mar 20 19:16:28.236675 systemd[1]: Started sshd@5-172.24.4.12:22-172.24.4.1:50924.service - OpenSSH per-connection server daemon (172.24.4.1:50924).
Mar 20 19:16:28.241198 systemd-logind[1464]: Removed session 7.
Mar 20 19:16:29.408598 sshd[1687]: Accepted publickey for core from 172.24.4.1 port 50924 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:16:29.411171 sshd-session[1687]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:16:29.422922 systemd-logind[1464]: New session 8 of user core.
Mar 20 19:16:29.433664 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 20 19:16:30.165596 sshd[1690]: Connection closed by 172.24.4.1 port 50924
Mar 20 19:16:30.165207 sshd-session[1687]: pam_unix(sshd:session): session closed for user core
Mar 20 19:16:30.183400 systemd[1]: sshd@5-172.24.4.12:22-172.24.4.1:50924.service: Deactivated successfully.
Mar 20 19:16:30.186659 systemd[1]: session-8.scope: Deactivated successfully.
Mar 20 19:16:30.188644 systemd-logind[1464]: Session 8 logged out. Waiting for processes to exit.
Mar 20 19:16:30.193767 systemd[1]: Started sshd@6-172.24.4.12:22-172.24.4.1:50940.service - OpenSSH per-connection server daemon (172.24.4.1:50940).
Mar 20 19:16:30.197542 systemd-logind[1464]: Removed session 8.
Mar 20 19:16:31.367223 sshd[1695]: Accepted publickey for core from 172.24.4.1 port 50940 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:16:31.369880 sshd-session[1695]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:16:31.381382 systemd-logind[1464]: New session 9 of user core.
Mar 20 19:16:31.392653 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 20 19:16:31.874584 sudo[1699]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 20 19:16:31.875854 sudo[1699]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 19:16:31.897431 sudo[1699]: pam_unix(sudo:session): session closed for user root
Mar 20 19:16:32.118091 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 20 19:16:32.121913 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 19:16:32.126545 sshd[1698]: Connection closed by 172.24.4.1 port 50940
Mar 20 19:16:32.124772 sshd-session[1695]: pam_unix(sshd:session): session closed for user core
Mar 20 19:16:32.142201 systemd[1]: sshd@6-172.24.4.12:22-172.24.4.1:50940.service: Deactivated successfully.
Mar 20 19:16:32.147195 systemd[1]: session-9.scope: Deactivated successfully.
Mar 20 19:16:32.150821 systemd-logind[1464]: Session 9 logged out. Waiting for processes to exit.
Mar 20 19:16:32.156850 systemd[1]: Started sshd@7-172.24.4.12:22-172.24.4.1:50946.service - OpenSSH per-connection server daemon (172.24.4.1:50946).
Mar 20 19:16:32.158902 systemd-logind[1464]: Removed session 9.
Mar 20 19:16:32.433167 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 19:16:32.447877 (kubelet)[1715]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 19:16:32.550210 kubelet[1715]: E0320 19:16:32.550108 1715 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 19:16:32.554874 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 19:16:32.555190 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 19:16:32.556197 systemd[1]: kubelet.service: Consumed 287ms CPU time, 95.4M memory peak.
Mar 20 19:16:33.370576 sshd[1707]: Accepted publickey for core from 172.24.4.1 port 50946 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:16:33.373161 sshd-session[1707]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:16:33.385923 systemd-logind[1464]: New session 10 of user core.
Mar 20 19:16:33.395652 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 20 19:16:33.854682 sudo[1725]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 20 19:16:33.856335 sudo[1725]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 19:16:33.863589 sudo[1725]: pam_unix(sudo:session): session closed for user root
Mar 20 19:16:33.875423 sudo[1724]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 20 19:16:33.876063 sudo[1724]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 19:16:33.896746 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 20 19:16:33.972999 augenrules[1747]: No rules
Mar 20 19:16:33.974708 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 20 19:16:33.975146 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 20 19:16:33.977771 sudo[1724]: pam_unix(sudo:session): session closed for user root
Mar 20 19:16:34.126073 sshd[1723]: Connection closed by 172.24.4.1 port 50946
Mar 20 19:16:34.127727 sshd-session[1707]: pam_unix(sshd:session): session closed for user core
Mar 20 19:16:34.146189 systemd[1]: sshd@7-172.24.4.12:22-172.24.4.1:50946.service: Deactivated successfully.
Mar 20 19:16:34.149637 systemd[1]: session-10.scope: Deactivated successfully.
Mar 20 19:16:34.151290 systemd-logind[1464]: Session 10 logged out. Waiting for processes to exit.
Mar 20 19:16:34.155819 systemd[1]: Started sshd@8-172.24.4.12:22-172.24.4.1:55710.service - OpenSSH per-connection server daemon (172.24.4.1:55710).
Mar 20 19:16:34.158413 systemd-logind[1464]: Removed session 10.
Mar 20 19:16:35.382803 sshd[1755]: Accepted publickey for core from 172.24.4.1 port 55710 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:16:35.385400 sshd-session[1755]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:16:35.396952 systemd-logind[1464]: New session 11 of user core.
Mar 20 19:16:35.407662 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 20 19:16:35.866085 sudo[1759]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 20 19:16:35.866784 sudo[1759]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 20 19:16:36.568829 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 20 19:16:36.586092 (dockerd)[1777]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 20 19:16:37.144480 dockerd[1777]: time="2025-03-20T19:16:37.144400286Z" level=info msg="Starting up"
Mar 20 19:16:37.145857 dockerd[1777]: time="2025-03-20T19:16:37.145804596Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 20 19:16:37.241459 dockerd[1777]: time="2025-03-20T19:16:37.241395515Z" level=info msg="Loading containers: start."
Mar 20 19:16:37.425411 kernel: Initializing XFRM netlink socket
Mar 20 19:16:37.553177 systemd-networkd[1393]: docker0: Link UP
Mar 20 19:16:37.618189 dockerd[1777]: time="2025-03-20T19:16:37.617711716Z" level=info msg="Loading containers: done."
Mar 20 19:16:37.637514 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2006131558-merged.mount: Deactivated successfully.
Mar 20 19:16:37.642132 dockerd[1777]: time="2025-03-20T19:16:37.641416903Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 20 19:16:37.642132 dockerd[1777]: time="2025-03-20T19:16:37.641586366Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
Mar 20 19:16:37.642132 dockerd[1777]: time="2025-03-20T19:16:37.641791696Z" level=info msg="Daemon has completed initialization"
Mar 20 19:16:37.708173 dockerd[1777]: time="2025-03-20T19:16:37.707976329Z" level=info msg="API listen on /run/docker.sock"
Mar 20 19:16:37.708573 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 20 19:16:39.355065 containerd[1481]: time="2025-03-20T19:16:39.355031157Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\""
Mar 20 19:16:40.057790 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3480490607.mount: Deactivated successfully.
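The dockerd and containerd records above are logfmt-style lines: space-separated key=value pairs where values are either bare tokens or double-quoted strings with backslash-escaped quotes. When triaging a boot log like this one, a small parser of that shape (a sketch, not an official API of either daemon) is enough to pull out the level and msg fields:

```python
import re

# key=value pairs; a value is either a double-quoted string that may
# contain backslash-escaped characters, or a bare non-space token.
_PAIR = re.compile(r'(\w+)=(?:"((?:[^"\\]|\\.)*)"|(\S+))')

def parse_logfmt(line: str) -> dict:
    """Parse a dockerd/containerd-style logfmt line into a dict."""
    out = {}
    for key, quoted, bare in _PAIR.findall(line):
        # findall leaves the non-matching alternative as ''; a bare
        # match means the quoted group is empty, and vice versa.
        value = bare if bare else quoted
        out[key] = value.replace('\\"', '"')  # unescape inner quotes
    return out

# A line taken verbatim from the dockerd startup in this log.
line = 'time="2025-03-20T19:16:37.144400286Z" level=info msg="Starting up"'
rec = parse_logfmt(line)
print(rec["level"], "-", rec["msg"])  # → info - Starting up
```

Filtering a whole journal dump for `rec.get("level") == "error"` quickly surfaces lines like the CNI load failure earlier in this log.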
Mar 20 19:16:42.231639 containerd[1481]: time="2025-03-20T19:16:42.231585498Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:16:42.233223 containerd[1481]: time="2025-03-20T19:16:42.232949611Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=32674581"
Mar 20 19:16:42.234431 containerd[1481]: time="2025-03-20T19:16:42.234399530Z" level=info msg="ImageCreate event name:\"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:16:42.237984 containerd[1481]: time="2025-03-20T19:16:42.237953120Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:16:42.239309 containerd[1481]: time="2025-03-20T19:16:42.238956810Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"32671373\" in 2.883886665s"
Mar 20 19:16:42.239309 containerd[1481]: time="2025-03-20T19:16:42.238992137Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\""
Mar 20 19:16:42.257071 containerd[1481]: time="2025-03-20T19:16:42.256876216Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\""
Mar 20 19:16:42.618592 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 20 19:16:42.623465 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 19:16:42.804798 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 19:16:42.810453 (kubelet)[2053]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 20 19:16:43.081403 kubelet[2053]: E0320 19:16:43.081178 2053 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 20 19:16:43.085539 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 20 19:16:43.085685 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 20 19:16:43.086162 systemd[1]: kubelet.service: Consumed 254ms CPU time, 98.1M memory peak.
Mar 20 19:16:44.945030 containerd[1481]: time="2025-03-20T19:16:44.944057700Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:16:44.945030 containerd[1481]: time="2025-03-20T19:16:44.944986057Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=29619780"
Mar 20 19:16:44.946571 containerd[1481]: time="2025-03-20T19:16:44.946522677Z" level=info msg="ImageCreate event name:\"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:16:44.953470 containerd[1481]: time="2025-03-20T19:16:44.953411885Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:16:44.954409 containerd[1481]: time="2025-03-20T19:16:44.954383666Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"31107380\" in 2.697473336s"
Mar 20 19:16:44.954466 containerd[1481]: time="2025-03-20T19:16:44.954414882Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\""
Mar 20 19:16:44.972499 containerd[1481]: time="2025-03-20T19:16:44.972460073Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\""
Mar 20 19:16:46.584256 containerd[1481]: time="2025-03-20T19:16:46.584091891Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:16:46.585589 containerd[1481]: time="2025-03-20T19:16:46.585295113Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=17903317"
Mar 20 19:16:46.586848 containerd[1481]: time="2025-03-20T19:16:46.586799792Z" level=info msg="ImageCreate event name:\"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:16:46.590311 containerd[1481]: time="2025-03-20T19:16:46.590288345Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:16:46.591682 containerd[1481]: time="2025-03-20T19:16:46.591656937Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"19390935\" in 1.619159995s"
Mar 20 19:16:46.591739 containerd[1481]: time="2025-03-20T19:16:46.591688062Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\""
Mar 20 19:16:46.608856 containerd[1481]: time="2025-03-20T19:16:46.608817670Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\""
Mar 20 19:16:48.006825 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1900097821.mount: Deactivated successfully.
Mar 20 19:16:48.489592 containerd[1481]: time="2025-03-20T19:16:48.489530275Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:48.491101 containerd[1481]: time="2025-03-20T19:16:48.491056222Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=29185380" Mar 20 19:16:48.492620 containerd[1481]: time="2025-03-20T19:16:48.492579585Z" level=info msg="ImageCreate event name:\"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:48.494705 containerd[1481]: time="2025-03-20T19:16:48.494649067Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:48.495161 containerd[1481]: time="2025-03-20T19:16:48.495131376Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"29184391\" in 1.886279874s" Mar 20 19:16:48.495206 containerd[1481]: time="2025-03-20T19:16:48.495161558Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\"" Mar 20 19:16:48.515601 containerd[1481]: time="2025-03-20T19:16:48.515552983Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 20 19:16:49.144328 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3260020080.mount: Deactivated successfully. 
Mar 20 19:16:50.793923 containerd[1481]: time="2025-03-20T19:16:50.793859714Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:50.795413 containerd[1481]: time="2025-03-20T19:16:50.795368633Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769" Mar 20 19:16:50.797196 containerd[1481]: time="2025-03-20T19:16:50.797114463Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:50.800100 containerd[1481]: time="2025-03-20T19:16:50.800059342Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:50.801479 containerd[1481]: time="2025-03-20T19:16:50.801344589Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.285665592s" Mar 20 19:16:50.801479 containerd[1481]: time="2025-03-20T19:16:50.801396366Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Mar 20 19:16:50.819167 containerd[1481]: time="2025-03-20T19:16:50.819078383Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Mar 20 19:16:51.364018 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4042244849.mount: Deactivated successfully. 
Mar 20 19:16:51.373394 containerd[1481]: time="2025-03-20T19:16:51.373121202Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:51.375092 containerd[1481]: time="2025-03-20T19:16:51.374976961Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298" Mar 20 19:16:51.376550 containerd[1481]: time="2025-03-20T19:16:51.376482248Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:51.381981 containerd[1481]: time="2025-03-20T19:16:51.381871909Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:51.384715 containerd[1481]: time="2025-03-20T19:16:51.383715734Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 564.33257ms" Mar 20 19:16:51.384715 containerd[1481]: time="2025-03-20T19:16:51.383787832Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Mar 20 19:16:51.426705 containerd[1481]: time="2025-03-20T19:16:51.426557714Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Mar 20 19:16:52.057311 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1441267435.mount: Deactivated successfully. Mar 20 19:16:52.538372 update_engine[1469]: I20250320 19:16:52.536377 1469 update_attempter.cc:509] Updating boot flags... 
Mar 20 19:16:52.581401 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (2208) Mar 20 19:16:52.686381 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (2206) Mar 20 19:16:53.118632 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 20 19:16:53.125748 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 19:16:53.281482 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 19:16:53.288768 (kubelet)[2223]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 20 19:16:53.542383 kubelet[2223]: E0320 19:16:53.541495 2223 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 20 19:16:53.546768 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 20 19:16:53.547022 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 20 19:16:53.547543 systemd[1]: kubelet.service: Consumed 231ms CPU time, 97.7M memory peak. 
Mar 20 19:16:55.120072 containerd[1481]: time="2025-03-20T19:16:55.119967243Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:55.125438 containerd[1481]: time="2025-03-20T19:16:55.125174677Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238579" Mar 20 19:16:55.131816 containerd[1481]: time="2025-03-20T19:16:55.131742295Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:55.229020 containerd[1481]: time="2025-03-20T19:16:55.228056839Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:16:55.232303 containerd[1481]: time="2025-03-20T19:16:55.231638290Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 3.805036746s" Mar 20 19:16:55.232303 containerd[1481]: time="2025-03-20T19:16:55.231726909Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" Mar 20 19:16:58.334271 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 19:16:58.334558 systemd[1]: kubelet.service: Consumed 231ms CPU time, 97.7M memory peak. Mar 20 19:16:58.337528 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 19:16:58.364477 systemd[1]: Reload requested from client PID 2319 ('systemctl') (unit session-11.scope)... 
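[Editor's note] Each containerd pull above reports both "bytes read" and a wall-clock duration (e.g. etcd:3.5.12-0, bytes read=57238579, pulled in 3.805036746s), so effective pull throughput can be derived directly from the log. A small sketch of that arithmetic; the numbers below are copied from the etcd entry, everything else is illustrative:

```python
def pull_throughput(bytes_read, seconds):
    """Bytes per second for a completed image pull, from the two
    figures containerd logs ("bytes read" and the "in <t>s" duration)."""
    return bytes_read / seconds

# etcd:3.5.12-0 from the log above: ~15 MB/s on this host.
bps = pull_throughput(57238579, 3.805036746)
print(f"{bps / 1e6:.1f} MB/s")
```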
Mar 20 19:16:58.364494 systemd[1]: Reloading... Mar 20 19:16:58.474394 zram_generator::config[2367]: No configuration found. Mar 20 19:16:58.692822 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 20 19:16:58.809815 systemd[1]: Reloading finished in 445 ms. Mar 20 19:16:58.866398 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 20 19:16:58.866478 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 20 19:16:58.866728 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 19:16:58.866768 systemd[1]: kubelet.service: Consumed 121ms CPU time, 83.6M memory peak. Mar 20 19:16:58.868470 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 20 19:16:59.018134 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 20 19:16:59.024845 (kubelet)[2430]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 20 19:16:59.074490 kubelet[2430]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 19:16:59.074868 kubelet[2430]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 19:16:59.074868 kubelet[2430]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 19:16:59.075113 kubelet[2430]: I0320 19:16:59.075042 2430 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 19:16:59.917446 kubelet[2430]: I0320 19:16:59.916758 2430 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 20 19:16:59.917446 kubelet[2430]: I0320 19:16:59.916813 2430 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 19:16:59.917446 kubelet[2430]: I0320 19:16:59.917239 2430 server.go:927] "Client rotation is on, will bootstrap in background" Mar 20 19:16:59.947375 kubelet[2430]: I0320 19:16:59.946931 2430 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 20 19:16:59.950002 kubelet[2430]: E0320 19:16:59.949945 2430 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.12:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.12:6443: connect: connection refused Mar 20 19:16:59.966200 kubelet[2430]: I0320 19:16:59.966156 2430 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 20 19:16:59.967393 kubelet[2430]: I0320 19:16:59.966852 2430 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 19:16:59.967393 kubelet[2430]: I0320 19:16:59.966925 2430 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-0-1-1-f6fba67404.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 20 19:16:59.967393 kubelet[2430]: I0320 19:16:59.967335 2430 topology_manager.go:138] "Creating topology manager with none 
policy" Mar 20 19:16:59.967819 kubelet[2430]: I0320 19:16:59.967792 2430 container_manager_linux.go:301] "Creating device plugin manager" Mar 20 19:16:59.968159 kubelet[2430]: I0320 19:16:59.968130 2430 state_mem.go:36] "Initialized new in-memory state store" Mar 20 19:16:59.970335 kubelet[2430]: I0320 19:16:59.970303 2430 kubelet.go:400] "Attempting to sync node with API server" Mar 20 19:16:59.970545 kubelet[2430]: I0320 19:16:59.970520 2430 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 19:16:59.971142 kubelet[2430]: W0320 19:16:59.970740 2430 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.12:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-1-1-f6fba67404.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused Mar 20 19:16:59.971142 kubelet[2430]: E0320 19:16:59.970794 2430 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.12:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-1-1-f6fba67404.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused Mar 20 19:16:59.971142 kubelet[2430]: I0320 19:16:59.970800 2430 kubelet.go:312] "Adding apiserver pod source" Mar 20 19:16:59.971142 kubelet[2430]: I0320 19:16:59.970848 2430 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 19:16:59.982255 kubelet[2430]: W0320 19:16:59.982172 2430 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.12:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused Mar 20 19:16:59.982635 kubelet[2430]: E0320 19:16:59.982500 2430 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
"https://172.24.4.12:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused Mar 20 19:16:59.982635 kubelet[2430]: I0320 19:16:59.982624 2430 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 20 19:16:59.984649 kubelet[2430]: I0320 19:16:59.984607 2430 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 19:16:59.984765 kubelet[2430]: W0320 19:16:59.984668 2430 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 20 19:16:59.985263 kubelet[2430]: I0320 19:16:59.985231 2430 server.go:1264] "Started kubelet" Mar 20 19:16:59.986749 kubelet[2430]: I0320 19:16:59.985682 2430 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 19:16:59.988101 kubelet[2430]: I0320 19:16:59.987813 2430 server.go:455] "Adding debug handlers to kubelet server" Mar 20 19:16:59.991513 kubelet[2430]: I0320 19:16:59.990989 2430 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 19:16:59.991513 kubelet[2430]: I0320 19:16:59.991198 2430 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 19:16:59.991513 kubelet[2430]: E0320 19:16:59.991410 2430 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.12:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.12:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-9999-0-1-1-f6fba67404.novalocal.182e98e61ff80d2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-9999-0-1-1-f6fba67404.novalocal,UID:ci-9999-0-1-1-f6fba67404.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ci-9999-0-1-1-f6fba67404.novalocal,},FirstTimestamp:2025-03-20 19:16:59.985210666 +0000 UTC m=+0.956786515,LastTimestamp:2025-03-20 19:16:59.985210666 +0000 UTC m=+0.956786515,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-0-1-1-f6fba67404.novalocal,}" Mar 20 19:16:59.992872 kubelet[2430]: I0320 19:16:59.992829 2430 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 19:16:59.999895 kubelet[2430]: I0320 19:16:59.998944 2430 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 20 19:17:00.001631 kubelet[2430]: I0320 19:17:00.001617 2430 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 20 19:17:00.001749 kubelet[2430]: I0320 19:17:00.001739 2430 reconciler.go:26] "Reconciler: start to sync state" Mar 20 19:17:00.002459 kubelet[2430]: W0320 19:17:00.002419 2430 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.12:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused Mar 20 19:17:00.002675 kubelet[2430]: E0320 19:17:00.002661 2430 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.12:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused Mar 20 19:17:00.003411 kubelet[2430]: E0320 19:17:00.003386 2430 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.12:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-1-1-f6fba67404.novalocal?timeout=10s\": dial tcp 172.24.4.12:6443: connect: connection refused" interval="200ms" Mar 20 19:17:00.003920 kubelet[2430]: I0320 19:17:00.003903 2430 factory.go:221] Registration of 
the systemd container factory successfully Mar 20 19:17:00.004160 kubelet[2430]: I0320 19:17:00.004143 2430 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 20 19:17:00.006301 kubelet[2430]: I0320 19:17:00.006284 2430 factory.go:221] Registration of the containerd container factory successfully Mar 20 19:17:00.014911 kubelet[2430]: E0320 19:17:00.014886 2430 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 20 19:17:00.030544 kubelet[2430]: I0320 19:17:00.030455 2430 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 19:17:00.031608 kubelet[2430]: I0320 19:17:00.031564 2430 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 19:17:00.031668 kubelet[2430]: I0320 19:17:00.031622 2430 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 19:17:00.031668 kubelet[2430]: I0320 19:17:00.031661 2430 kubelet.go:2337] "Starting kubelet main sync loop" Mar 20 19:17:00.031771 kubelet[2430]: E0320 19:17:00.031734 2430 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 19:17:00.042186 kubelet[2430]: W0320 19:17:00.042159 2430 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.12:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused Mar 20 19:17:00.042442 kubelet[2430]: E0320 19:17:00.042255 2430 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
"https://172.24.4.12:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused Mar 20 19:17:00.044404 kubelet[2430]: I0320 19:17:00.044076 2430 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 20 19:17:00.044404 kubelet[2430]: I0320 19:17:00.044114 2430 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 20 19:17:00.044404 kubelet[2430]: I0320 19:17:00.044148 2430 state_mem.go:36] "Initialized new in-memory state store" Mar 20 19:17:00.048447 kubelet[2430]: I0320 19:17:00.048403 2430 policy_none.go:49] "None policy: Start" Mar 20 19:17:00.049227 kubelet[2430]: I0320 19:17:00.049141 2430 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 19:17:00.049227 kubelet[2430]: I0320 19:17:00.049184 2430 state_mem.go:35] "Initializing new in-memory state store" Mar 20 19:17:00.058856 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 20 19:17:00.073292 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 20 19:17:00.078987 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 20 19:17:00.090174 kubelet[2430]: I0320 19:17:00.089628 2430 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 19:17:00.090174 kubelet[2430]: I0320 19:17:00.089773 2430 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 19:17:00.090174 kubelet[2430]: I0320 19:17:00.089875 2430 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 19:17:00.091775 kubelet[2430]: E0320 19:17:00.091760 2430 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-9999-0-1-1-f6fba67404.novalocal\" not found" Mar 20 19:17:00.102029 kubelet[2430]: I0320 19:17:00.101934 2430 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:17:00.102734 kubelet[2430]: E0320 19:17:00.102660 2430 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.12:6443/api/v1/nodes\": dial tcp 172.24.4.12:6443: connect: connection refused" node="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:17:00.132003 kubelet[2430]: I0320 19:17:00.131929 2430 topology_manager.go:215] "Topology Admit Handler" podUID="8e49caf03ba664b69c58839274b024a7" podNamespace="kube-system" podName="kube-scheduler-ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:17:00.135517 kubelet[2430]: I0320 19:17:00.134937 2430 topology_manager.go:215] "Topology Admit Handler" podUID="6ef9ebbd3cf996bdddaa0133ce57dc22" podNamespace="kube-system" podName="kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:17:00.138314 kubelet[2430]: I0320 19:17:00.137891 2430 topology_manager.go:215] "Topology Admit Handler" podUID="c7240b17fe9676a09c6232c2f4d21fed" podNamespace="kube-system" podName="kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:17:00.153919 systemd[1]: Created slice 
kubepods-burstable-pod8e49caf03ba664b69c58839274b024a7.slice - libcontainer container kubepods-burstable-pod8e49caf03ba664b69c58839274b024a7.slice. Mar 20 19:17:00.171778 systemd[1]: Created slice kubepods-burstable-pod6ef9ebbd3cf996bdddaa0133ce57dc22.slice - libcontainer container kubepods-burstable-pod6ef9ebbd3cf996bdddaa0133ce57dc22.slice. Mar 20 19:17:00.191648 systemd[1]: Created slice kubepods-burstable-podc7240b17fe9676a09c6232c2f4d21fed.slice - libcontainer container kubepods-burstable-podc7240b17fe9676a09c6232c2f4d21fed.slice. Mar 20 19:17:00.204647 kubelet[2430]: E0320 19:17:00.204559 2430 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.12:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-1-1-f6fba67404.novalocal?timeout=10s\": dial tcp 172.24.4.12:6443: connect: connection refused" interval="400ms" Mar 20 19:17:00.303848 kubelet[2430]: I0320 19:17:00.303488 2430 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c7240b17fe9676a09c6232c2f4d21fed-kubeconfig\") pod \"kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"c7240b17fe9676a09c6232c2f4d21fed\") " pod="kube-system/kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:17:00.303848 kubelet[2430]: I0320 19:17:00.303588 2430 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8e49caf03ba664b69c58839274b024a7-kubeconfig\") pod \"kube-scheduler-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"8e49caf03ba664b69c58839274b024a7\") " pod="kube-system/kube-scheduler-ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:17:00.303848 kubelet[2430]: I0320 19:17:00.303636 2430 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/6ef9ebbd3cf996bdddaa0133ce57dc22-ca-certs\") pod \"kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"6ef9ebbd3cf996bdddaa0133ce57dc22\") " pod="kube-system/kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:17:00.303848 kubelet[2430]: I0320 19:17:00.303698 2430 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6ef9ebbd3cf996bdddaa0133ce57dc22-k8s-certs\") pod \"kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"6ef9ebbd3cf996bdddaa0133ce57dc22\") " pod="kube-system/kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:17:00.303848 kubelet[2430]: I0320 19:17:00.303753 2430 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6ef9ebbd3cf996bdddaa0133ce57dc22-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"6ef9ebbd3cf996bdddaa0133ce57dc22\") " pod="kube-system/kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:17:00.304315 kubelet[2430]: I0320 19:17:00.303800 2430 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c7240b17fe9676a09c6232c2f4d21fed-ca-certs\") pod \"kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"c7240b17fe9676a09c6232c2f4d21fed\") " pod="kube-system/kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:17:00.304315 kubelet[2430]: I0320 19:17:00.303845 2430 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c7240b17fe9676a09c6232c2f4d21fed-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"c7240b17fe9676a09c6232c2f4d21fed\") " 
pod="kube-system/kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:00.304315 kubelet[2430]: I0320 19:17:00.303900 2430 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c7240b17fe9676a09c6232c2f4d21fed-k8s-certs\") pod \"kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"c7240b17fe9676a09c6232c2f4d21fed\") " pod="kube-system/kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:00.304315 kubelet[2430]: I0320 19:17:00.303950 2430 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c7240b17fe9676a09c6232c2f4d21fed-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"c7240b17fe9676a09c6232c2f4d21fed\") " pod="kube-system/kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:00.307283 kubelet[2430]: I0320 19:17:00.306614 2430 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:00.307283 kubelet[2430]: E0320 19:17:00.307197 2430 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.12:6443/api/v1/nodes\": dial tcp 172.24.4.12:6443: connect: connection refused" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:00.468468 containerd[1481]: time="2025-03-20T19:17:00.467639910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-0-1-1-f6fba67404.novalocal,Uid:8e49caf03ba664b69c58839274b024a7,Namespace:kube-system,Attempt:0,}"
Mar 20 19:17:00.487491 containerd[1481]: time="2025-03-20T19:17:00.487304567Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal,Uid:6ef9ebbd3cf996bdddaa0133ce57dc22,Namespace:kube-system,Attempt:0,}"
Mar 20 19:17:00.498374 containerd[1481]: time="2025-03-20T19:17:00.497935823Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal,Uid:c7240b17fe9676a09c6232c2f4d21fed,Namespace:kube-system,Attempt:0,}"
Mar 20 19:17:00.606335 kubelet[2430]: E0320 19:17:00.606218 2430 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.12:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-1-1-f6fba67404.novalocal?timeout=10s\": dial tcp 172.24.4.12:6443: connect: connection refused" interval="800ms"
Mar 20 19:17:00.711113 kubelet[2430]: I0320 19:17:00.710613 2430 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:00.711447 kubelet[2430]: E0320 19:17:00.711400 2430 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.12:6443/api/v1/nodes\": dial tcp 172.24.4.12:6443: connect: connection refused" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:00.843037 kubelet[2430]: W0320 19:17:00.842828 2430 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.12:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused
Mar 20 19:17:00.843037 kubelet[2430]: E0320 19:17:00.842899 2430 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.12:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused
Mar 20 19:17:00.953789 kubelet[2430]: W0320 19:17:00.953725 2430 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.12:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused
Mar 20 19:17:00.953789 kubelet[2430]: E0320 19:17:00.953799 2430 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.12:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused
Mar 20 19:17:01.205482 kubelet[2430]: W0320 19:17:01.205278 2430 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.12:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-1-1-f6fba67404.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused
Mar 20 19:17:01.205482 kubelet[2430]: E0320 19:17:01.205447 2430 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.12:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-1-1-f6fba67404.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused
Mar 20 19:17:01.390611 kubelet[2430]: W0320 19:17:01.390149 2430 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.12:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused
Mar 20 19:17:01.390842 kubelet[2430]: E0320 19:17:01.390696 2430 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.12:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.12:6443: connect: connection refused
Mar 20 19:17:01.408124 kubelet[2430]: E0320 19:17:01.408027 2430 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.12:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-1-1-f6fba67404.novalocal?timeout=10s\": dial tcp 172.24.4.12:6443: connect: connection refused" interval="1.6s"
Mar 20 19:17:01.514765 kubelet[2430]: I0320 19:17:01.514620 2430 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:01.515594 kubelet[2430]: E0320 19:17:01.515526 2430 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.12:6443/api/v1/nodes\": dial tcp 172.24.4.12:6443: connect: connection refused" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:01.995675 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2031820616.mount: Deactivated successfully.
Mar 20 19:17:02.005302 containerd[1481]: time="2025-03-20T19:17:02.005192374Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 19:17:02.010526 containerd[1481]: time="2025-03-20T19:17:02.010202914Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146"
Mar 20 19:17:02.012408 containerd[1481]: time="2025-03-20T19:17:02.011957575Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 19:17:02.013624 containerd[1481]: time="2025-03-20T19:17:02.013496057Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 19:17:02.016837 containerd[1481]: time="2025-03-20T19:17:02.016744101Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 19:17:02.018529 containerd[1481]: time="2025-03-20T19:17:02.018429505Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 20 19:17:02.020821 containerd[1481]: time="2025-03-20T19:17:02.020568989Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 20 19:17:02.022414 containerd[1481]: time="2025-03-20T19:17:02.022162631Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 20 19:17:02.025294 containerd[1481]: time="2025-03-20T19:17:02.024168460Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 632.33395ms"
Mar 20 19:17:02.032463 containerd[1481]: time="2025-03-20T19:17:02.032383869Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 633.069473ms"
Mar 20 19:17:02.034422 containerd[1481]: time="2025-03-20T19:17:02.033840710Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 639.129679ms"
Mar 20 19:17:02.083431 containerd[1481]: time="2025-03-20T19:17:02.082657479Z" level=info msg="connecting to shim 177919e0f7e1386ca629b4847a4936869261fee48ffd0713f6d659c34215b8eb" address="unix:///run/containerd/s/8e7658254daa7ae42fe18a5d170e6de3636eae9dcce8e793862e75e509d153b0" namespace=k8s.io protocol=ttrpc version=3
Mar 20 19:17:02.106649 containerd[1481]: time="2025-03-20T19:17:02.106331456Z" level=info msg="connecting to shim 6a4a0a5919b4a95cf80b66d73aea46152e09eedd1d95c46b80aede30dca3158a" address="unix:///run/containerd/s/225e1bcac47b4bf1d527d60c8ece0dbf183010bd13f98e9fa09e86bd0bf41838" namespace=k8s.io protocol=ttrpc version=3
Mar 20 19:17:02.107915 kubelet[2430]: E0320 19:17:02.107894 2430 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.12:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.12:6443: connect: connection refused
Mar 20 19:17:02.111077 containerd[1481]: time="2025-03-20T19:17:02.111032642Z" level=info msg="connecting to shim 2e53644d0b037fcb3c9af1e8659c02cb282e6a8bfcadf9bfb7ec9c705022a334" address="unix:///run/containerd/s/9a4af742cb15e415edf65fe4857b76e2b7963381292bed80e111908ae6d0c06b" namespace=k8s.io protocol=ttrpc version=3
Mar 20 19:17:02.135520 systemd[1]: Started cri-containerd-6a4a0a5919b4a95cf80b66d73aea46152e09eedd1d95c46b80aede30dca3158a.scope - libcontainer container 6a4a0a5919b4a95cf80b66d73aea46152e09eedd1d95c46b80aede30dca3158a.
Mar 20 19:17:02.139855 systemd[1]: Started cri-containerd-177919e0f7e1386ca629b4847a4936869261fee48ffd0713f6d659c34215b8eb.scope - libcontainer container 177919e0f7e1386ca629b4847a4936869261fee48ffd0713f6d659c34215b8eb.
Mar 20 19:17:02.158670 systemd[1]: Started cri-containerd-2e53644d0b037fcb3c9af1e8659c02cb282e6a8bfcadf9bfb7ec9c705022a334.scope - libcontainer container 2e53644d0b037fcb3c9af1e8659c02cb282e6a8bfcadf9bfb7ec9c705022a334.
Mar 20 19:17:02.226717 containerd[1481]: time="2025-03-20T19:17:02.226596363Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-0-1-1-f6fba67404.novalocal,Uid:8e49caf03ba664b69c58839274b024a7,Namespace:kube-system,Attempt:0,} returns sandbox id \"6a4a0a5919b4a95cf80b66d73aea46152e09eedd1d95c46b80aede30dca3158a\""
Mar 20 19:17:02.231987 containerd[1481]: time="2025-03-20T19:17:02.231858693Z" level=info msg="CreateContainer within sandbox \"6a4a0a5919b4a95cf80b66d73aea46152e09eedd1d95c46b80aede30dca3158a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 20 19:17:02.233238 containerd[1481]: time="2025-03-20T19:17:02.233022251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal,Uid:6ef9ebbd3cf996bdddaa0133ce57dc22,Namespace:kube-system,Attempt:0,} returns sandbox id \"177919e0f7e1386ca629b4847a4936869261fee48ffd0713f6d659c34215b8eb\""
Mar 20 19:17:02.235788 containerd[1481]: time="2025-03-20T19:17:02.235749172Z" level=info msg="CreateContainer within sandbox \"177919e0f7e1386ca629b4847a4936869261fee48ffd0713f6d659c34215b8eb\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 20 19:17:02.247331 containerd[1481]: time="2025-03-20T19:17:02.246798563Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal,Uid:c7240b17fe9676a09c6232c2f4d21fed,Namespace:kube-system,Attempt:0,} returns sandbox id \"2e53644d0b037fcb3c9af1e8659c02cb282e6a8bfcadf9bfb7ec9c705022a334\""
Mar 20 19:17:02.250509 containerd[1481]: time="2025-03-20T19:17:02.250478454Z" level=info msg="CreateContainer within sandbox \"2e53644d0b037fcb3c9af1e8659c02cb282e6a8bfcadf9bfb7ec9c705022a334\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 20 19:17:02.252214 containerd[1481]: time="2025-03-20T19:17:02.252189969Z" level=info msg="Container ac9ebf61e868f230579f990a5fe7531dcfeb3f5dd7dbee194d3707c97eba1ae8: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:17:02.264424 containerd[1481]: time="2025-03-20T19:17:02.264388911Z" level=info msg="Container 0d5167d1744260f4c6a1dc4e65868c0ef627d59856b347ea484d5fd3fc7a7e83: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:17:02.268023 containerd[1481]: time="2025-03-20T19:17:02.267976437Z" level=info msg="Container 151204fc5434340e64ca0ec8a44bc90dd2cffaa5d2dc17db7db7f7a93d525242: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:17:02.276383 containerd[1481]: time="2025-03-20T19:17:02.276305732Z" level=info msg="CreateContainer within sandbox \"6a4a0a5919b4a95cf80b66d73aea46152e09eedd1d95c46b80aede30dca3158a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"ac9ebf61e868f230579f990a5fe7531dcfeb3f5dd7dbee194d3707c97eba1ae8\""
Mar 20 19:17:02.277381 containerd[1481]: time="2025-03-20T19:17:02.277141200Z" level=info msg="StartContainer for \"ac9ebf61e868f230579f990a5fe7531dcfeb3f5dd7dbee194d3707c97eba1ae8\""
Mar 20 19:17:02.278208 containerd[1481]: time="2025-03-20T19:17:02.278168990Z" level=info msg="connecting to shim ac9ebf61e868f230579f990a5fe7531dcfeb3f5dd7dbee194d3707c97eba1ae8" address="unix:///run/containerd/s/225e1bcac47b4bf1d527d60c8ece0dbf183010bd13f98e9fa09e86bd0bf41838" protocol=ttrpc version=3
Mar 20 19:17:02.287731 containerd[1481]: time="2025-03-20T19:17:02.286206325Z" level=info msg="CreateContainer within sandbox \"2e53644d0b037fcb3c9af1e8659c02cb282e6a8bfcadf9bfb7ec9c705022a334\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"151204fc5434340e64ca0ec8a44bc90dd2cffaa5d2dc17db7db7f7a93d525242\""
Mar 20 19:17:02.287731 containerd[1481]: time="2025-03-20T19:17:02.286666468Z" level=info msg="StartContainer for \"151204fc5434340e64ca0ec8a44bc90dd2cffaa5d2dc17db7db7f7a93d525242\""
Mar 20 19:17:02.288046 containerd[1481]: time="2025-03-20T19:17:02.288016787Z" level=info msg="connecting to shim 151204fc5434340e64ca0ec8a44bc90dd2cffaa5d2dc17db7db7f7a93d525242" address="unix:///run/containerd/s/9a4af742cb15e415edf65fe4857b76e2b7963381292bed80e111908ae6d0c06b" protocol=ttrpc version=3
Mar 20 19:17:02.291105 containerd[1481]: time="2025-03-20T19:17:02.291068050Z" level=info msg="CreateContainer within sandbox \"177919e0f7e1386ca629b4847a4936869261fee48ffd0713f6d659c34215b8eb\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"0d5167d1744260f4c6a1dc4e65868c0ef627d59856b347ea484d5fd3fc7a7e83\""
Mar 20 19:17:02.292426 containerd[1481]: time="2025-03-20T19:17:02.292401957Z" level=info msg="StartContainer for \"0d5167d1744260f4c6a1dc4e65868c0ef627d59856b347ea484d5fd3fc7a7e83\""
Mar 20 19:17:02.293417 containerd[1481]: time="2025-03-20T19:17:02.293388914Z" level=info msg="connecting to shim 0d5167d1744260f4c6a1dc4e65868c0ef627d59856b347ea484d5fd3fc7a7e83" address="unix:///run/containerd/s/8e7658254daa7ae42fe18a5d170e6de3636eae9dcce8e793862e75e509d153b0" protocol=ttrpc version=3
Mar 20 19:17:02.298536 systemd[1]: Started cri-containerd-ac9ebf61e868f230579f990a5fe7531dcfeb3f5dd7dbee194d3707c97eba1ae8.scope - libcontainer container ac9ebf61e868f230579f990a5fe7531dcfeb3f5dd7dbee194d3707c97eba1ae8.
Mar 20 19:17:02.317526 systemd[1]: Started cri-containerd-151204fc5434340e64ca0ec8a44bc90dd2cffaa5d2dc17db7db7f7a93d525242.scope - libcontainer container 151204fc5434340e64ca0ec8a44bc90dd2cffaa5d2dc17db7db7f7a93d525242.
Mar 20 19:17:02.328487 systemd[1]: Started cri-containerd-0d5167d1744260f4c6a1dc4e65868c0ef627d59856b347ea484d5fd3fc7a7e83.scope - libcontainer container 0d5167d1744260f4c6a1dc4e65868c0ef627d59856b347ea484d5fd3fc7a7e83.
Mar 20 19:17:02.390693 containerd[1481]: time="2025-03-20T19:17:02.390654753Z" level=info msg="StartContainer for \"ac9ebf61e868f230579f990a5fe7531dcfeb3f5dd7dbee194d3707c97eba1ae8\" returns successfully"
Mar 20 19:17:02.414074 containerd[1481]: time="2025-03-20T19:17:02.414009687Z" level=info msg="StartContainer for \"151204fc5434340e64ca0ec8a44bc90dd2cffaa5d2dc17db7db7f7a93d525242\" returns successfully"
Mar 20 19:17:02.419162 containerd[1481]: time="2025-03-20T19:17:02.419128393Z" level=info msg="StartContainer for \"0d5167d1744260f4c6a1dc4e65868c0ef627d59856b347ea484d5fd3fc7a7e83\" returns successfully"
Mar 20 19:17:03.117972 kubelet[2430]: I0320 19:17:03.117938 2430 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:04.108377 kubelet[2430]: E0320 19:17:04.107774 2430 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-9999-0-1-1-f6fba67404.novalocal\" not found" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:04.201538 kubelet[2430]: I0320 19:17:04.201298 2430 kubelet_node_status.go:76] "Successfully registered node" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:04.983817 kubelet[2430]: I0320 19:17:04.983333 2430 apiserver.go:52] "Watching apiserver"
Mar 20 19:17:05.002494 kubelet[2430]: I0320 19:17:05.002268 2430 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Mar 20 19:17:05.177571 kubelet[2430]: W0320 19:17:05.177467 2430 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 20 19:17:06.787938 systemd[1]: Reload requested from client PID 2702 ('systemctl') (unit session-11.scope)...
Mar 20 19:17:06.787975 systemd[1]: Reloading...
Mar 20 19:17:06.918385 zram_generator::config[2763]: No configuration found.
Mar 20 19:17:07.046317 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 20 19:17:07.184293 systemd[1]: Reloading finished in 395 ms.
Mar 20 19:17:07.209509 kubelet[2430]: I0320 19:17:07.209473 2430 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 20 19:17:07.209500 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 19:17:07.220797 systemd[1]: kubelet.service: Deactivated successfully.
Mar 20 19:17:07.221129 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 19:17:07.221250 systemd[1]: kubelet.service: Consumed 1.504s CPU time, 113.2M memory peak.
Mar 20 19:17:07.224095 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 20 19:17:07.574017 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 20 19:17:07.594792 (kubelet)[2812]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 20 19:17:07.656062 kubelet[2812]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 19:17:07.656062 kubelet[2812]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 20 19:17:07.656062 kubelet[2812]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 19:17:07.656062 kubelet[2812]: I0320 19:17:07.656063 2812 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 20 19:17:07.660483 kubelet[2812]: I0320 19:17:07.660438 2812 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Mar 20 19:17:07.660483 kubelet[2812]: I0320 19:17:07.660462 2812 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 19:17:07.660792 kubelet[2812]: I0320 19:17:07.660657 2812 server.go:927] "Client rotation is on, will bootstrap in background"
Mar 20 19:17:07.662066 kubelet[2812]: I0320 19:17:07.662004 2812 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 19:17:07.663886 kubelet[2812]: I0320 19:17:07.663597 2812 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 20 19:17:07.674435 kubelet[2812]: I0320 19:17:07.673797 2812 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 20 19:17:07.674435 kubelet[2812]: I0320 19:17:07.673989 2812 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 19:17:07.674435 kubelet[2812]: I0320 19:17:07.674012 2812 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-0-1-1-f6fba67404.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Mar 20 19:17:07.674435 kubelet[2812]: I0320 19:17:07.674320 2812 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 19:17:07.675648 kubelet[2812]: I0320 19:17:07.674332 2812 container_manager_linux.go:301] "Creating device plugin manager"
Mar 20 19:17:07.675648 kubelet[2812]: I0320 19:17:07.674416 2812 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 19:17:07.675648 kubelet[2812]: I0320 19:17:07.674499 2812 kubelet.go:400] "Attempting to sync node with API server"
Mar 20 19:17:07.675648 kubelet[2812]: I0320 19:17:07.674512 2812 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 19:17:07.675648 kubelet[2812]: I0320 19:17:07.674530 2812 kubelet.go:312] "Adding apiserver pod source"
Mar 20 19:17:07.675648 kubelet[2812]: I0320 19:17:07.674547 2812 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 19:17:07.685521 kubelet[2812]: I0320 19:17:07.679665 2812 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 20 19:17:07.685521 kubelet[2812]: I0320 19:17:07.681536 2812 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 19:17:07.685521 kubelet[2812]: I0320 19:17:07.682022 2812 server.go:1264] "Started kubelet"
Mar 20 19:17:07.685521 kubelet[2812]: I0320 19:17:07.682245 2812 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 19:17:07.685521 kubelet[2812]: I0320 19:17:07.682295 2812 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 19:17:07.685521 kubelet[2812]: I0320 19:17:07.682598 2812 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 19:17:07.685521 kubelet[2812]: I0320 19:17:07.683853 2812 server.go:455] "Adding debug handlers to kubelet server"
Mar 20 19:17:07.688458 kubelet[2812]: E0320 19:17:07.688432 2812 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 20 19:17:07.690594 kubelet[2812]: I0320 19:17:07.689718 2812 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 19:17:07.691170 kubelet[2812]: I0320 19:17:07.691090 2812 volume_manager.go:291] "Starting Kubelet Volume Manager"
Mar 20 19:17:07.691240 kubelet[2812]: I0320 19:17:07.691208 2812 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Mar 20 19:17:07.691456 kubelet[2812]: I0320 19:17:07.691314 2812 reconciler.go:26] "Reconciler: start to sync state"
Mar 20 19:17:07.708449 kubelet[2812]: I0320 19:17:07.708000 2812 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 20 19:17:07.708999 kubelet[2812]: I0320 19:17:07.708809 2812 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 20 19:17:07.708999 kubelet[2812]: I0320 19:17:07.708837 2812 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 20 19:17:07.708999 kubelet[2812]: I0320 19:17:07.708852 2812 kubelet.go:2337] "Starting kubelet main sync loop"
Mar 20 19:17:07.708999 kubelet[2812]: E0320 19:17:07.708892 2812 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 20 19:17:07.722001 kubelet[2812]: I0320 19:17:07.721979 2812 factory.go:221] Registration of the containerd container factory successfully
Mar 20 19:17:07.722196 kubelet[2812]: I0320 19:17:07.722187 2812 factory.go:221] Registration of the systemd container factory successfully
Mar 20 19:17:07.723458 kubelet[2812]: I0320 19:17:07.723439 2812 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 20 19:17:07.810242 kubelet[2812]: E0320 19:17:07.809470 2812 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 20 19:17:07.816097 kubelet[2812]: I0320 19:17:07.816075 2812 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:07.841325 kubelet[2812]: I0320 19:17:07.839915 2812 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 20 19:17:07.841995 kubelet[2812]: I0320 19:17:07.841586 2812 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 20 19:17:07.841995 kubelet[2812]: I0320 19:17:07.841613 2812 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 19:17:07.842534 kubelet[2812]: I0320 19:17:07.842471 2812 kubelet_node_status.go:112] "Node was previously registered" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:07.842728 kubelet[2812]: I0320 19:17:07.842716 2812 kubelet_node_status.go:76] "Successfully registered node" node="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:07.843179 kubelet[2812]: I0320 19:17:07.842659 2812 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 20 19:17:07.843179 kubelet[2812]: I0320 19:17:07.842881 2812 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 20 19:17:07.843179 kubelet[2812]: I0320 19:17:07.842918 2812 policy_none.go:49] "None policy: Start"
Mar 20 19:17:07.843944 kubelet[2812]: I0320 19:17:07.843863 2812 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 20 19:17:07.844491 kubelet[2812]: I0320 19:17:07.844230 2812 state_mem.go:35] "Initializing new in-memory state store"
Mar 20 19:17:07.844491 kubelet[2812]: I0320 19:17:07.844412 2812 state_mem.go:75] "Updated machine memory state"
Mar 20 19:17:07.854219 kubelet[2812]: I0320 19:17:07.852495 2812 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 20 19:17:07.855211 kubelet[2812]: I0320 19:17:07.854905 2812 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 20 19:17:07.856771 kubelet[2812]: I0320 19:17:07.856427 2812 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 20 19:17:08.011660 kubelet[2812]: I0320 19:17:08.011602 2812 topology_manager.go:215] "Topology Admit Handler" podUID="8e49caf03ba664b69c58839274b024a7" podNamespace="kube-system" podName="kube-scheduler-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.013150 kubelet[2812]: I0320 19:17:08.012111 2812 topology_manager.go:215] "Topology Admit Handler" podUID="6ef9ebbd3cf996bdddaa0133ce57dc22" podNamespace="kube-system" podName="kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.013150 kubelet[2812]: I0320 19:17:08.012234 2812 topology_manager.go:215] "Topology Admit Handler" podUID="c7240b17fe9676a09c6232c2f4d21fed" podNamespace="kube-system" podName="kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.024549 kubelet[2812]: W0320 19:17:08.024108 2812 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 20 19:17:08.029459 kubelet[2812]: W0320 19:17:08.029398 2812 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 20 19:17:08.032458 kubelet[2812]: W0320 19:17:08.032427 2812 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 20 19:17:08.032567 kubelet[2812]: E0320 19:17:08.032536 2812 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.098246 kubelet[2812]: I0320 19:17:08.098120 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c7240b17fe9676a09c6232c2f4d21fed-kubeconfig\") pod \"kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"c7240b17fe9676a09c6232c2f4d21fed\") " pod="kube-system/kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.098365 kubelet[2812]: I0320 19:17:08.098222 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8e49caf03ba664b69c58839274b024a7-kubeconfig\") pod \"kube-scheduler-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"8e49caf03ba664b69c58839274b024a7\") " pod="kube-system/kube-scheduler-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.098397 kubelet[2812]: I0320 19:17:08.098303 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6ef9ebbd3cf996bdddaa0133ce57dc22-ca-certs\") pod \"kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"6ef9ebbd3cf996bdddaa0133ce57dc22\") " pod="kube-system/kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.098435 kubelet[2812]: I0320 19:17:08.098417 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6ef9ebbd3cf996bdddaa0133ce57dc22-k8s-certs\") pod \"kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"6ef9ebbd3cf996bdddaa0133ce57dc22\") " pod="kube-system/kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.098521 kubelet[2812]: I0320 19:17:08.098472 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6ef9ebbd3cf996bdddaa0133ce57dc22-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"6ef9ebbd3cf996bdddaa0133ce57dc22\") " pod="kube-system/kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.098603 kubelet[2812]: I0320 19:17:08.098573 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c7240b17fe9676a09c6232c2f4d21fed-k8s-certs\") pod \"kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"c7240b17fe9676a09c6232c2f4d21fed\") " pod="kube-system/kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.098644 kubelet[2812]: I0320 19:17:08.098627 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c7240b17fe9676a09c6232c2f4d21fed-ca-certs\") pod \"kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"c7240b17fe9676a09c6232c2f4d21fed\") " pod="kube-system/kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.098767 kubelet[2812]: I0320 19:17:08.098675 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c7240b17fe9676a09c6232c2f4d21fed-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"c7240b17fe9676a09c6232c2f4d21fed\") " pod="kube-system/kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.098767 kubelet[2812]: I0320 19:17:08.098721 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c7240b17fe9676a09c6232c2f4d21fed-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal\" (UID: \"c7240b17fe9676a09c6232c2f4d21fed\") " pod="kube-system/kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.679181 kubelet[2812]: I0320 19:17:08.678559 2812 apiserver.go:52] "Watching apiserver"
Mar 20 19:17:08.692271 kubelet[2812]: I0320 19:17:08.692189 2812 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Mar 20 19:17:08.795612 kubelet[2812]: W0320 19:17:08.792888 2812 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 20 19:17:08.795612 kubelet[2812]: E0320 19:17:08.793021 2812 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:08.852305 kubelet[2812]: I0320 19:17:08.852235 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-9999-0-1-1-f6fba67404.novalocal" podStartSLOduration=3.852217826 podStartE2EDuration="3.852217826s" podCreationTimestamp="2025-03-20 19:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 19:17:08.836019261 +0000 UTC m=+1.230841483" watchObservedRunningTime="2025-03-20 19:17:08.852217826 +0000 UTC m=+1.247040038"
Mar 20 19:17:08.852849 kubelet[2812]: I0320 19:17:08.852803 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-9999-0-1-1-f6fba67404.novalocal" podStartSLOduration=0.852795809 podStartE2EDuration="852.795809ms" podCreationTimestamp="2025-03-20 19:17:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 19:17:08.851699969 +0000 UTC m=+1.246522182" watchObservedRunningTime="2025-03-20 19:17:08.852795809 +0000 UTC m=+1.247618011"
Mar 20 19:17:08.881427 kubelet[2812]: I0320 19:17:08.881287 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="kube-system/kube-scheduler-ci-9999-0-1-1-f6fba67404.novalocal" podStartSLOduration=0.881269766 podStartE2EDuration="881.269766ms" podCreationTimestamp="2025-03-20 19:17:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 19:17:08.862279366 +0000 UTC m=+1.257101589" watchObservedRunningTime="2025-03-20 19:17:08.881269766 +0000 UTC m=+1.276091968" Mar 20 19:17:13.504437 sudo[1759]: pam_unix(sudo:session): session closed for user root Mar 20 19:17:13.714837 sshd[1758]: Connection closed by 172.24.4.1 port 55710 Mar 20 19:17:13.715830 sshd-session[1755]: pam_unix(sshd:session): session closed for user core Mar 20 19:17:13.722868 systemd[1]: sshd@8-172.24.4.12:22-172.24.4.1:55710.service: Deactivated successfully. Mar 20 19:17:13.728531 systemd[1]: session-11.scope: Deactivated successfully. Mar 20 19:17:13.729281 systemd[1]: session-11.scope: Consumed 6.066s CPU time, 243.8M memory peak. Mar 20 19:17:13.734344 systemd-logind[1464]: Session 11 logged out. Waiting for processes to exit. Mar 20 19:17:13.737191 systemd-logind[1464]: Removed session 11. Mar 20 19:17:22.802827 kubelet[2812]: I0320 19:17:22.802287 2812 topology_manager.go:215] "Topology Admit Handler" podUID="ad40d6eb-c0cd-496b-a60b-c6512bc68b68" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-8sg9j" Mar 20 19:17:22.812415 systemd[1]: Created slice kubepods-besteffort-podad40d6eb_c0cd_496b_a60b_c6512bc68b68.slice - libcontainer container kubepods-besteffort-podad40d6eb_c0cd_496b_a60b_c6512bc68b68.slice. Mar 20 19:17:22.834051 kubelet[2812]: I0320 19:17:22.833946 2812 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 20 19:17:22.834591 containerd[1481]: time="2025-03-20T19:17:22.834561038Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Mar 20 19:17:22.835041 kubelet[2812]: I0320 19:17:22.835009 2812 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 20 19:17:22.896659 kubelet[2812]: I0320 19:17:22.896529 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ad40d6eb-c0cd-496b-a60b-c6512bc68b68-var-lib-calico\") pod \"tigera-operator-6479d6dc54-8sg9j\" (UID: \"ad40d6eb-c0cd-496b-a60b-c6512bc68b68\") " pod="tigera-operator/tigera-operator-6479d6dc54-8sg9j" Mar 20 19:17:22.896659 kubelet[2812]: I0320 19:17:22.896605 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qsdk\" (UniqueName: \"kubernetes.io/projected/ad40d6eb-c0cd-496b-a60b-c6512bc68b68-kube-api-access-6qsdk\") pod \"tigera-operator-6479d6dc54-8sg9j\" (UID: \"ad40d6eb-c0cd-496b-a60b-c6512bc68b68\") " pod="tigera-operator/tigera-operator-6479d6dc54-8sg9j" Mar 20 19:17:23.120236 containerd[1481]: time="2025-03-20T19:17:23.120142208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-8sg9j,Uid:ad40d6eb-c0cd-496b-a60b-c6512bc68b68,Namespace:tigera-operator,Attempt:0,}" Mar 20 19:17:23.164522 kubelet[2812]: I0320 19:17:23.163725 2812 topology_manager.go:215] "Topology Admit Handler" podUID="8752d9af-4ac4-4ee1-8921-5413c475849f" podNamespace="kube-system" podName="kube-proxy-h9brt" Mar 20 19:17:23.169969 containerd[1481]: time="2025-03-20T19:17:23.167872981Z" level=info msg="connecting to shim 4657f3df2e7d8ddc75d73f56a565ab744e9ea90cd73e6b968b7d0919e2eba89e" address="unix:///run/containerd/s/18ac1224ee38b0aea3558fc25be8b271da2e56999ddc85db18c601efb9c3822b" namespace=k8s.io protocol=ttrpc version=3 Mar 20 19:17:23.177706 systemd[1]: Created slice kubepods-besteffort-pod8752d9af_4ac4_4ee1_8921_5413c475849f.slice - libcontainer container 
kubepods-besteffort-pod8752d9af_4ac4_4ee1_8921_5413c475849f.slice. Mar 20 19:17:23.199619 kubelet[2812]: I0320 19:17:23.199583 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8752d9af-4ac4-4ee1-8921-5413c475849f-kube-proxy\") pod \"kube-proxy-h9brt\" (UID: \"8752d9af-4ac4-4ee1-8921-5413c475849f\") " pod="kube-system/kube-proxy-h9brt" Mar 20 19:17:23.199830 kubelet[2812]: I0320 19:17:23.199777 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8752d9af-4ac4-4ee1-8921-5413c475849f-lib-modules\") pod \"kube-proxy-h9brt\" (UID: \"8752d9af-4ac4-4ee1-8921-5413c475849f\") " pod="kube-system/kube-proxy-h9brt" Mar 20 19:17:23.199830 kubelet[2812]: I0320 19:17:23.199807 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8752d9af-4ac4-4ee1-8921-5413c475849f-xtables-lock\") pod \"kube-proxy-h9brt\" (UID: \"8752d9af-4ac4-4ee1-8921-5413c475849f\") " pod="kube-system/kube-proxy-h9brt" Mar 20 19:17:23.200334 kubelet[2812]: I0320 19:17:23.200274 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79xh6\" (UniqueName: \"kubernetes.io/projected/8752d9af-4ac4-4ee1-8921-5413c475849f-kube-api-access-79xh6\") pod \"kube-proxy-h9brt\" (UID: \"8752d9af-4ac4-4ee1-8921-5413c475849f\") " pod="kube-system/kube-proxy-h9brt" Mar 20 19:17:23.221506 systemd[1]: Started cri-containerd-4657f3df2e7d8ddc75d73f56a565ab744e9ea90cd73e6b968b7d0919e2eba89e.scope - libcontainer container 4657f3df2e7d8ddc75d73f56a565ab744e9ea90cd73e6b968b7d0919e2eba89e. 
Mar 20 19:17:23.278632 containerd[1481]: time="2025-03-20T19:17:23.278484359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-8sg9j,Uid:ad40d6eb-c0cd-496b-a60b-c6512bc68b68,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4657f3df2e7d8ddc75d73f56a565ab744e9ea90cd73e6b968b7d0919e2eba89e\"" Mar 20 19:17:23.284391 containerd[1481]: time="2025-03-20T19:17:23.283579748Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 20 19:17:23.483899 containerd[1481]: time="2025-03-20T19:17:23.483669780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-h9brt,Uid:8752d9af-4ac4-4ee1-8921-5413c475849f,Namespace:kube-system,Attempt:0,}" Mar 20 19:17:23.537888 containerd[1481]: time="2025-03-20T19:17:23.537586187Z" level=info msg="connecting to shim 641ec437cc39c8099da5a5188169496480ed864d10902571a8b6cb0f509909e5" address="unix:///run/containerd/s/7c7e15873e357a762472aebe3b6e76940cf4f0fa5cb63d27a6469d198b62c267" namespace=k8s.io protocol=ttrpc version=3 Mar 20 19:17:23.585504 systemd[1]: Started cri-containerd-641ec437cc39c8099da5a5188169496480ed864d10902571a8b6cb0f509909e5.scope - libcontainer container 641ec437cc39c8099da5a5188169496480ed864d10902571a8b6cb0f509909e5. 
Mar 20 19:17:23.625298 containerd[1481]: time="2025-03-20T19:17:23.625236349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-h9brt,Uid:8752d9af-4ac4-4ee1-8921-5413c475849f,Namespace:kube-system,Attempt:0,} returns sandbox id \"641ec437cc39c8099da5a5188169496480ed864d10902571a8b6cb0f509909e5\"" Mar 20 19:17:23.628307 containerd[1481]: time="2025-03-20T19:17:23.628230120Z" level=info msg="CreateContainer within sandbox \"641ec437cc39c8099da5a5188169496480ed864d10902571a8b6cb0f509909e5\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 20 19:17:23.645750 containerd[1481]: time="2025-03-20T19:17:23.644551975Z" level=info msg="Container 4d85aea4da3c93e8c944dc2fea1b060d3f3d448031addbe5468209645a918b54: CDI devices from CRI Config.CDIDevices: []" Mar 20 19:17:23.657995 containerd[1481]: time="2025-03-20T19:17:23.657968752Z" level=info msg="CreateContainer within sandbox \"641ec437cc39c8099da5a5188169496480ed864d10902571a8b6cb0f509909e5\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"4d85aea4da3c93e8c944dc2fea1b060d3f3d448031addbe5468209645a918b54\"" Mar 20 19:17:23.658746 containerd[1481]: time="2025-03-20T19:17:23.658725443Z" level=info msg="StartContainer for \"4d85aea4da3c93e8c944dc2fea1b060d3f3d448031addbe5468209645a918b54\"" Mar 20 19:17:23.660293 containerd[1481]: time="2025-03-20T19:17:23.660268462Z" level=info msg="connecting to shim 4d85aea4da3c93e8c944dc2fea1b060d3f3d448031addbe5468209645a918b54" address="unix:///run/containerd/s/7c7e15873e357a762472aebe3b6e76940cf4f0fa5cb63d27a6469d198b62c267" protocol=ttrpc version=3 Mar 20 19:17:23.680627 systemd[1]: Started cri-containerd-4d85aea4da3c93e8c944dc2fea1b060d3f3d448031addbe5468209645a918b54.scope - libcontainer container 4d85aea4da3c93e8c944dc2fea1b060d3f3d448031addbe5468209645a918b54. 
Mar 20 19:17:23.726474 containerd[1481]: time="2025-03-20T19:17:23.726445002Z" level=info msg="StartContainer for \"4d85aea4da3c93e8c944dc2fea1b060d3f3d448031addbe5468209645a918b54\" returns successfully" Mar 20 19:17:23.821342 kubelet[2812]: I0320 19:17:23.821140 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-h9brt" podStartSLOduration=0.821123666 podStartE2EDuration="821.123666ms" podCreationTimestamp="2025-03-20 19:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 19:17:23.820449835 +0000 UTC m=+16.215272037" watchObservedRunningTime="2025-03-20 19:17:23.821123666 +0000 UTC m=+16.215945868" Mar 20 19:17:25.547620 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1847049736.mount: Deactivated successfully. Mar 20 19:17:26.138983 containerd[1481]: time="2025-03-20T19:17:26.138936013Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:17:26.140198 containerd[1481]: time="2025-03-20T19:17:26.140150745Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008" Mar 20 19:17:26.141783 containerd[1481]: time="2025-03-20T19:17:26.141719238Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:17:26.144690 containerd[1481]: time="2025-03-20T19:17:26.144642393Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:17:26.145374 containerd[1481]: time="2025-03-20T19:17:26.145315029Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id 
\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 2.861704332s" Mar 20 19:17:26.145434 containerd[1481]: time="2025-03-20T19:17:26.145378392Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\"" Mar 20 19:17:26.148533 containerd[1481]: time="2025-03-20T19:17:26.148442579Z" level=info msg="CreateContainer within sandbox \"4657f3df2e7d8ddc75d73f56a565ab744e9ea90cd73e6b968b7d0919e2eba89e\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 20 19:17:26.163676 containerd[1481]: time="2025-03-20T19:17:26.163483101Z" level=info msg="Container c0138758eb3cf5b38a885978262b1061bd0cd2bea3e7b94ff2e526c94f29ad9c: CDI devices from CRI Config.CDIDevices: []" Mar 20 19:17:26.174418 containerd[1481]: time="2025-03-20T19:17:26.174357453Z" level=info msg="CreateContainer within sandbox \"4657f3df2e7d8ddc75d73f56a565ab744e9ea90cd73e6b968b7d0919e2eba89e\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"c0138758eb3cf5b38a885978262b1061bd0cd2bea3e7b94ff2e526c94f29ad9c\"" Mar 20 19:17:26.174978 containerd[1481]: time="2025-03-20T19:17:26.174908204Z" level=info msg="StartContainer for \"c0138758eb3cf5b38a885978262b1061bd0cd2bea3e7b94ff2e526c94f29ad9c\"" Mar 20 19:17:26.176051 containerd[1481]: time="2025-03-20T19:17:26.175985361Z" level=info msg="connecting to shim c0138758eb3cf5b38a885978262b1061bd0cd2bea3e7b94ff2e526c94f29ad9c" address="unix:///run/containerd/s/18ac1224ee38b0aea3558fc25be8b271da2e56999ddc85db18c601efb9c3822b" protocol=ttrpc version=3 Mar 20 19:17:26.198522 systemd[1]: Started cri-containerd-c0138758eb3cf5b38a885978262b1061bd0cd2bea3e7b94ff2e526c94f29ad9c.scope - libcontainer container 
c0138758eb3cf5b38a885978262b1061bd0cd2bea3e7b94ff2e526c94f29ad9c. Mar 20 19:17:26.231217 containerd[1481]: time="2025-03-20T19:17:26.231181409Z" level=info msg="StartContainer for \"c0138758eb3cf5b38a885978262b1061bd0cd2bea3e7b94ff2e526c94f29ad9c\" returns successfully" Mar 20 19:17:27.742447 kubelet[2812]: I0320 19:17:27.742294 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-8sg9j" podStartSLOduration=2.878422203 podStartE2EDuration="5.742259899s" podCreationTimestamp="2025-03-20 19:17:22 +0000 UTC" firstStartedPulling="2025-03-20 19:17:23.282980832 +0000 UTC m=+15.677803074" lastFinishedPulling="2025-03-20 19:17:26.146818558 +0000 UTC m=+18.541640770" observedRunningTime="2025-03-20 19:17:26.841043608 +0000 UTC m=+19.235865860" watchObservedRunningTime="2025-03-20 19:17:27.742259899 +0000 UTC m=+20.137082151" Mar 20 19:17:29.711924 kubelet[2812]: I0320 19:17:29.711873 2812 topology_manager.go:215] "Topology Admit Handler" podUID="4e2c434f-3f0f-44eb-9b02-1e9062595f29" podNamespace="calico-system" podName="calico-typha-7fdc4c8bbd-55qvz" Mar 20 19:17:29.724064 systemd[1]: Created slice kubepods-besteffort-pod4e2c434f_3f0f_44eb_9b02_1e9062595f29.slice - libcontainer container kubepods-besteffort-pod4e2c434f_3f0f_44eb_9b02_1e9062595f29.slice. 
Mar 20 19:17:29.744502 kubelet[2812]: I0320 19:17:29.744465 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spllw\" (UniqueName: \"kubernetes.io/projected/4e2c434f-3f0f-44eb-9b02-1e9062595f29-kube-api-access-spllw\") pod \"calico-typha-7fdc4c8bbd-55qvz\" (UID: \"4e2c434f-3f0f-44eb-9b02-1e9062595f29\") " pod="calico-system/calico-typha-7fdc4c8bbd-55qvz" Mar 20 19:17:29.744732 kubelet[2812]: I0320 19:17:29.744512 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e2c434f-3f0f-44eb-9b02-1e9062595f29-tigera-ca-bundle\") pod \"calico-typha-7fdc4c8bbd-55qvz\" (UID: \"4e2c434f-3f0f-44eb-9b02-1e9062595f29\") " pod="calico-system/calico-typha-7fdc4c8bbd-55qvz" Mar 20 19:17:29.744732 kubelet[2812]: I0320 19:17:29.744570 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4e2c434f-3f0f-44eb-9b02-1e9062595f29-typha-certs\") pod \"calico-typha-7fdc4c8bbd-55qvz\" (UID: \"4e2c434f-3f0f-44eb-9b02-1e9062595f29\") " pod="calico-system/calico-typha-7fdc4c8bbd-55qvz" Mar 20 19:17:29.820867 kubelet[2812]: I0320 19:17:29.819979 2812 topology_manager.go:215] "Topology Admit Handler" podUID="a9039d2e-8f0c-48c4-b46a-3a34401e53a3" podNamespace="calico-system" podName="calico-node-8zjwg" Mar 20 19:17:29.834325 systemd[1]: Created slice kubepods-besteffort-poda9039d2e_8f0c_48c4_b46a_3a34401e53a3.slice - libcontainer container kubepods-besteffort-poda9039d2e_8f0c_48c4_b46a_3a34401e53a3.slice. 
Mar 20 19:17:29.845842 kubelet[2812]: I0320 19:17:29.844846 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-flexvol-driver-host\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.845842 kubelet[2812]: I0320 19:17:29.844896 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-node-certs\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.845842 kubelet[2812]: I0320 19:17:29.844920 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-log-dir\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.845842 kubelet[2812]: I0320 19:17:29.844939 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-policysync\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.845842 kubelet[2812]: I0320 19:17:29.844958 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bfgp\" (UniqueName: \"kubernetes.io/projected/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-kube-api-access-7bfgp\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.846075 kubelet[2812]: I0320 19:17:29.844995 2812 
reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-var-run-calico\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.846075 kubelet[2812]: I0320 19:17:29.845034 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-lib-modules\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.846075 kubelet[2812]: I0320 19:17:29.845053 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-var-lib-calico\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.846075 kubelet[2812]: I0320 19:17:29.845096 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-net-dir\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.846075 kubelet[2812]: I0320 19:17:29.845128 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-tigera-ca-bundle\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.846205 kubelet[2812]: I0320 19:17:29.845156 2812 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-xtables-lock\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.846205 kubelet[2812]: I0320 19:17:29.845176 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-bin-dir\") pod \"calico-node-8zjwg\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") " pod="calico-system/calico-node-8zjwg" Mar 20 19:17:29.947595 kubelet[2812]: E0320 19:17:29.947564 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:29.947595 kubelet[2812]: W0320 19:17:29.947585 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:29.947595 kubelet[2812]: E0320 19:17:29.947610 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:29.948068 kubelet[2812]: E0320 19:17:29.947949 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:29.948068 kubelet[2812]: W0320 19:17:29.947961 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:29.948068 kubelet[2812]: E0320 19:17:29.947972 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:29.948889 kubelet[2812]: E0320 19:17:29.948780 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:29.948889 kubelet[2812]: W0320 19:17:29.948795 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:29.950000 kubelet[2812]: E0320 19:17:29.949718 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:29.950000 kubelet[2812]: W0320 19:17:29.949728 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:29.950000 kubelet[2812]: E0320 19:17:29.949740 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:29.950826 kubelet[2812]: E0320 19:17:29.950795 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:29.950826 kubelet[2812]: W0320 19:17:29.950814 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:29.950933 kubelet[2812]: E0320 19:17:29.950836 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:29.951256 kubelet[2812]: E0320 19:17:29.951076 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:29.951256 kubelet[2812]: W0320 19:17:29.951092 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:29.951256 kubelet[2812]: E0320 19:17:29.951093 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:29.951256 kubelet[2812]: E0320 19:17:29.951103 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:29.956240 kubelet[2812]: E0320 19:17:29.953448 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:29.956410 kubelet[2812]: W0320 19:17:29.956389 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:29.956545 kubelet[2812]: E0320 19:17:29.956487 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:29.956937 kubelet[2812]: E0320 19:17:29.956889 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:29.956937 kubelet[2812]: W0320 19:17:29.956901 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:29.956937 kubelet[2812]: E0320 19:17:29.956916 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:29.958464 kubelet[2812]: E0320 19:17:29.958438 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:29.958464 kubelet[2812]: W0320 19:17:29.958460 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:29.959397 kubelet[2812]: E0320 19:17:29.958509 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:29.959397 kubelet[2812]: E0320 19:17:29.958701 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:29.959397 kubelet[2812]: W0320 19:17:29.958710 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:29.959397 kubelet[2812]: E0320 19:17:29.958720 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:29.979582 kubelet[2812]: E0320 19:17:29.979492 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:29.979582 kubelet[2812]: W0320 19:17:29.979512 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:29.979582 kubelet[2812]: E0320 19:17:29.979529 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:29.987250 kubelet[2812]: I0320 19:17:29.986104 2812 topology_manager.go:215] "Topology Admit Handler" podUID="90ee6a2c-61e2-4dc7-b67c-cee29f534863" podNamespace="calico-system" podName="csi-node-driver-h7zvw" Mar 20 19:17:29.987250 kubelet[2812]: E0320 19:17:29.986388 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h7zvw" podUID="90ee6a2c-61e2-4dc7-b67c-cee29f534863" Mar 20 19:17:30.032836 containerd[1481]: time="2025-03-20T19:17:30.032571081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7fdc4c8bbd-55qvz,Uid:4e2c434f-3f0f-44eb-9b02-1e9062595f29,Namespace:calico-system,Attempt:0,}" Mar 20 19:17:30.038993 kubelet[2812]: E0320 19:17:30.038947 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.040024 kubelet[2812]: W0320 19:17:30.038970 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.040098 kubelet[2812]: E0320 19:17:30.040025 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.040955 kubelet[2812]: E0320 19:17:30.040314 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.040955 kubelet[2812]: W0320 19:17:30.040330 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.040955 kubelet[2812]: E0320 19:17:30.040375 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.040955 kubelet[2812]: E0320 19:17:30.040663 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.040955 kubelet[2812]: W0320 19:17:30.040674 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.040955 kubelet[2812]: E0320 19:17:30.040685 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.043250 kubelet[2812]: E0320 19:17:30.041018 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.043250 kubelet[2812]: W0320 19:17:30.041055 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.043250 kubelet[2812]: E0320 19:17:30.041069 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.043250 kubelet[2812]: E0320 19:17:30.041307 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.043250 kubelet[2812]: W0320 19:17:30.041317 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.043250 kubelet[2812]: E0320 19:17:30.041328 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.043250 kubelet[2812]: E0320 19:17:30.042417 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.043250 kubelet[2812]: W0320 19:17:30.042446 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.043250 kubelet[2812]: E0320 19:17:30.042471 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.043250 kubelet[2812]: E0320 19:17:30.042748 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.043528 kubelet[2812]: W0320 19:17:30.042759 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.043528 kubelet[2812]: E0320 19:17:30.042770 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.043528 kubelet[2812]: E0320 19:17:30.043032 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.043528 kubelet[2812]: W0320 19:17:30.043042 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.043528 kubelet[2812]: E0320 19:17:30.043086 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.043528 kubelet[2812]: E0320 19:17:30.043302 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.043528 kubelet[2812]: W0320 19:17:30.043312 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.043528 kubelet[2812]: E0320 19:17:30.043323 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.043829 kubelet[2812]: E0320 19:17:30.043599 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.043829 kubelet[2812]: W0320 19:17:30.043610 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.043829 kubelet[2812]: E0320 19:17:30.043621 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.044560 kubelet[2812]: E0320 19:17:30.044061 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.044560 kubelet[2812]: W0320 19:17:30.044077 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.044560 kubelet[2812]: E0320 19:17:30.044089 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.044560 kubelet[2812]: E0320 19:17:30.044302 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.044560 kubelet[2812]: W0320 19:17:30.044315 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.044560 kubelet[2812]: E0320 19:17:30.044326 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.047373 kubelet[2812]: E0320 19:17:30.044596 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.047373 kubelet[2812]: W0320 19:17:30.044609 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.047373 kubelet[2812]: E0320 19:17:30.045077 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.047373 kubelet[2812]: E0320 19:17:30.045291 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.047373 kubelet[2812]: W0320 19:17:30.045301 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.047373 kubelet[2812]: E0320 19:17:30.045312 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.047373 kubelet[2812]: E0320 19:17:30.045624 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.047373 kubelet[2812]: W0320 19:17:30.045660 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.047373 kubelet[2812]: E0320 19:17:30.045674 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.047373 kubelet[2812]: E0320 19:17:30.045939 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.047723 kubelet[2812]: W0320 19:17:30.045951 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.047723 kubelet[2812]: E0320 19:17:30.045961 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.047723 kubelet[2812]: E0320 19:17:30.046211 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.047723 kubelet[2812]: W0320 19:17:30.046221 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.047723 kubelet[2812]: E0320 19:17:30.046232 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.047723 kubelet[2812]: E0320 19:17:30.046806 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.047723 kubelet[2812]: W0320 19:17:30.046819 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.047723 kubelet[2812]: E0320 19:17:30.046830 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.047723 kubelet[2812]: E0320 19:17:30.047306 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.047723 kubelet[2812]: W0320 19:17:30.047316 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.048055 kubelet[2812]: E0320 19:17:30.047327 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.048055 kubelet[2812]: E0320 19:17:30.047890 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.048055 kubelet[2812]: W0320 19:17:30.047900 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.048055 kubelet[2812]: E0320 19:17:30.047937 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.048612 kubelet[2812]: E0320 19:17:30.048487 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.048612 kubelet[2812]: W0320 19:17:30.048503 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.048612 kubelet[2812]: E0320 19:17:30.048515 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.048612 kubelet[2812]: I0320 19:17:30.048543 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghfgx\" (UniqueName: \"kubernetes.io/projected/90ee6a2c-61e2-4dc7-b67c-cee29f534863-kube-api-access-ghfgx\") pod \"csi-node-driver-h7zvw\" (UID: \"90ee6a2c-61e2-4dc7-b67c-cee29f534863\") " pod="calico-system/csi-node-driver-h7zvw" Mar 20 19:17:30.049440 kubelet[2812]: E0320 19:17:30.049418 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.049440 kubelet[2812]: W0320 19:17:30.049436 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.049522 kubelet[2812]: E0320 19:17:30.049452 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.049522 kubelet[2812]: I0320 19:17:30.049473 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90ee6a2c-61e2-4dc7-b67c-cee29f534863-kubelet-dir\") pod \"csi-node-driver-h7zvw\" (UID: \"90ee6a2c-61e2-4dc7-b67c-cee29f534863\") " pod="calico-system/csi-node-driver-h7zvw" Mar 20 19:17:30.049930 kubelet[2812]: E0320 19:17:30.049906 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.050432 kubelet[2812]: W0320 19:17:30.050402 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.050432 kubelet[2812]: E0320 19:17:30.050427 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.050529 kubelet[2812]: I0320 19:17:30.050445 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/90ee6a2c-61e2-4dc7-b67c-cee29f534863-varrun\") pod \"csi-node-driver-h7zvw\" (UID: \"90ee6a2c-61e2-4dc7-b67c-cee29f534863\") " pod="calico-system/csi-node-driver-h7zvw" Mar 20 19:17:30.050764 kubelet[2812]: E0320 19:17:30.050742 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.050764 kubelet[2812]: W0320 19:17:30.050757 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.050888 kubelet[2812]: E0320 19:17:30.050863 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.050929 kubelet[2812]: I0320 19:17:30.050896 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/90ee6a2c-61e2-4dc7-b67c-cee29f534863-registration-dir\") pod \"csi-node-driver-h7zvw\" (UID: \"90ee6a2c-61e2-4dc7-b67c-cee29f534863\") " pod="calico-system/csi-node-driver-h7zvw" Mar 20 19:17:30.051271 kubelet[2812]: E0320 19:17:30.051252 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.051326 kubelet[2812]: W0320 19:17:30.051266 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.051486 kubelet[2812]: E0320 19:17:30.051390 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.052018 kubelet[2812]: E0320 19:17:30.051694 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.052018 kubelet[2812]: W0320 19:17:30.051705 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.052018 kubelet[2812]: E0320 19:17:30.051722 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.052337 kubelet[2812]: E0320 19:17:30.052304 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.052337 kubelet[2812]: W0320 19:17:30.052332 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.052438 kubelet[2812]: E0320 19:17:30.052386 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.052766 kubelet[2812]: E0320 19:17:30.052745 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.052766 kubelet[2812]: W0320 19:17:30.052761 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.052850 kubelet[2812]: E0320 19:17:30.052776 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.053382 kubelet[2812]: I0320 19:17:30.052905 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/90ee6a2c-61e2-4dc7-b67c-cee29f534863-socket-dir\") pod \"csi-node-driver-h7zvw\" (UID: \"90ee6a2c-61e2-4dc7-b67c-cee29f534863\") " pod="calico-system/csi-node-driver-h7zvw" Mar 20 19:17:30.053600 kubelet[2812]: E0320 19:17:30.053541 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.053600 kubelet[2812]: W0320 19:17:30.053559 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.053600 kubelet[2812]: E0320 19:17:30.053581 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.053997 kubelet[2812]: E0320 19:17:30.053976 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.053997 kubelet[2812]: W0320 19:17:30.053992 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.054081 kubelet[2812]: E0320 19:17:30.054003 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.054489 kubelet[2812]: E0320 19:17:30.054445 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.054489 kubelet[2812]: W0320 19:17:30.054459 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.054489 kubelet[2812]: E0320 19:17:30.054473 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.054792 kubelet[2812]: E0320 19:17:30.054776 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.054792 kubelet[2812]: W0320 19:17:30.054788 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.054865 kubelet[2812]: E0320 19:17:30.054798 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.055764 kubelet[2812]: E0320 19:17:30.055391 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.055764 kubelet[2812]: W0320 19:17:30.055404 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.055764 kubelet[2812]: E0320 19:17:30.055413 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.055764 kubelet[2812]: E0320 19:17:30.055545 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.055764 kubelet[2812]: W0320 19:17:30.055553 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.055764 kubelet[2812]: E0320 19:17:30.055561 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.055976 kubelet[2812]: E0320 19:17:30.055885 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.055976 kubelet[2812]: W0320 19:17:30.055895 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.055976 kubelet[2812]: E0320 19:17:30.055905 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.077476 containerd[1481]: time="2025-03-20T19:17:30.076755169Z" level=info msg="connecting to shim 9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7" address="unix:///run/containerd/s/62fc3c0634526df2df31e7bea3de71b4d718e7553848089037116fa7b53a09cd" namespace=k8s.io protocol=ttrpc version=3 Mar 20 19:17:30.117546 systemd[1]: Started cri-containerd-9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7.scope - libcontainer container 9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7. 
Mar 20 19:17:30.137915 containerd[1481]: time="2025-03-20T19:17:30.137867560Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8zjwg,Uid:a9039d2e-8f0c-48c4-b46a-3a34401e53a3,Namespace:calico-system,Attempt:0,}" Mar 20 19:17:30.156486 kubelet[2812]: E0320 19:17:30.156440 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.156486 kubelet[2812]: W0320 19:17:30.156462 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.156486 kubelet[2812]: E0320 19:17:30.156479 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.156765 kubelet[2812]: E0320 19:17:30.156722 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.156765 kubelet[2812]: W0320 19:17:30.156731 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.156765 kubelet[2812]: E0320 19:17:30.156742 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.156950 kubelet[2812]: E0320 19:17:30.156930 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.156950 kubelet[2812]: W0320 19:17:30.156942 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.157029 kubelet[2812]: E0320 19:17:30.156953 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.157281 kubelet[2812]: E0320 19:17:30.157135 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.157281 kubelet[2812]: W0320 19:17:30.157151 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.157281 kubelet[2812]: E0320 19:17:30.157161 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.157844 kubelet[2812]: E0320 19:17:30.157729 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.157844 kubelet[2812]: W0320 19:17:30.157749 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.157844 kubelet[2812]: E0320 19:17:30.157773 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.158363 kubelet[2812]: E0320 19:17:30.157974 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.158434 kubelet[2812]: W0320 19:17:30.158422 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.158560 kubelet[2812]: E0320 19:17:30.158499 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.158807 kubelet[2812]: E0320 19:17:30.158718 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.158807 kubelet[2812]: W0320 19:17:30.158730 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.158807 kubelet[2812]: E0320 19:17:30.158755 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.159072 kubelet[2812]: E0320 19:17:30.159023 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.160789 kubelet[2812]: W0320 19:17:30.159129 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.160789 kubelet[2812]: E0320 19:17:30.159160 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.160992 kubelet[2812]: E0320 19:17:30.160980 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.161128 kubelet[2812]: W0320 19:17:30.161045 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.161128 kubelet[2812]: E0320 19:17:30.161078 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.161550 kubelet[2812]: E0320 19:17:30.161461 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.161550 kubelet[2812]: W0320 19:17:30.161472 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.161550 kubelet[2812]: E0320 19:17:30.161497 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.161733 kubelet[2812]: E0320 19:17:30.161721 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.161797 kubelet[2812]: W0320 19:17:30.161787 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.161917 kubelet[2812]: E0320 19:17:30.161884 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.162625 kubelet[2812]: E0320 19:17:30.162175 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.162625 kubelet[2812]: W0320 19:17:30.162187 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.162625 kubelet[2812]: E0320 19:17:30.162215 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.162983 kubelet[2812]: E0320 19:17:30.162891 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.162983 kubelet[2812]: W0320 19:17:30.162903 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.162983 kubelet[2812]: E0320 19:17:30.162931 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.163176 kubelet[2812]: E0320 19:17:30.163153 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.163312 kubelet[2812]: W0320 19:17:30.163231 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.163312 kubelet[2812]: E0320 19:17:30.163267 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.163738 kubelet[2812]: E0320 19:17:30.163539 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.163738 kubelet[2812]: W0320 19:17:30.163550 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.163738 kubelet[2812]: E0320 19:17:30.163586 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.165131 kubelet[2812]: E0320 19:17:30.163892 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.165316 kubelet[2812]: W0320 19:17:30.165204 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.165316 kubelet[2812]: E0320 19:17:30.165277 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.165567 kubelet[2812]: E0320 19:17:30.165554 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.165702 kubelet[2812]: W0320 19:17:30.165625 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.165702 kubelet[2812]: E0320 19:17:30.165669 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.165941 kubelet[2812]: E0320 19:17:30.165857 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.165941 kubelet[2812]: W0320 19:17:30.165869 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.166427 kubelet[2812]: E0320 19:17:30.166402 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.166633 kubelet[2812]: E0320 19:17:30.166543 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.166633 kubelet[2812]: W0320 19:17:30.166556 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.166633 kubelet[2812]: E0320 19:17:30.166603 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.166843 kubelet[2812]: E0320 19:17:30.166831 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.167004 kubelet[2812]: W0320 19:17:30.166902 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.167004 kubelet[2812]: E0320 19:17:30.166947 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.167239 kubelet[2812]: E0320 19:17:30.167159 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.167239 kubelet[2812]: W0320 19:17:30.167171 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.167671 kubelet[2812]: E0320 19:17:30.167518 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.167671 kubelet[2812]: E0320 19:17:30.167526 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.167671 kubelet[2812]: W0320 19:17:30.167543 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.167671 kubelet[2812]: E0320 19:17:30.167563 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.167902 kubelet[2812]: E0320 19:17:30.167890 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.167964 kubelet[2812]: W0320 19:17:30.167954 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.168282 kubelet[2812]: E0320 19:17:30.168016 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.168665 kubelet[2812]: E0320 19:17:30.168609 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.168665 kubelet[2812]: W0320 19:17:30.168621 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.168665 kubelet[2812]: E0320 19:17:30.168637 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:30.169760 kubelet[2812]: E0320 19:17:30.169481 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.169760 kubelet[2812]: W0320 19:17:30.169501 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.169760 kubelet[2812]: E0320 19:17:30.169516 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.189357 kubelet[2812]: E0320 19:17:30.189251 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:30.190053 kubelet[2812]: W0320 19:17:30.189572 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:30.190053 kubelet[2812]: E0320 19:17:30.189603 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:30.190275 containerd[1481]: time="2025-03-20T19:17:30.189326226Z" level=info msg="connecting to shim dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf" address="unix:///run/containerd/s/4299742a679952fdfa9653c36c4d7a1756b1af2c2d33988052baa381ea8b58cb" namespace=k8s.io protocol=ttrpc version=3 Mar 20 19:17:30.231579 systemd[1]: Started cri-containerd-dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf.scope - libcontainer container dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf. 
Mar 20 19:17:30.287967 containerd[1481]: time="2025-03-20T19:17:30.287906664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7fdc4c8bbd-55qvz,Uid:4e2c434f-3f0f-44eb-9b02-1e9062595f29,Namespace:calico-system,Attempt:0,} returns sandbox id \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\"" Mar 20 19:17:30.293965 containerd[1481]: time="2025-03-20T19:17:30.293734198Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 20 19:17:30.301305 containerd[1481]: time="2025-03-20T19:17:30.301266551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8zjwg,Uid:a9039d2e-8f0c-48c4-b46a-3a34401e53a3,Namespace:calico-system,Attempt:0,} returns sandbox id \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\"" Mar 20 19:17:31.710947 kubelet[2812]: E0320 19:17:31.709914 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h7zvw" podUID="90ee6a2c-61e2-4dc7-b67c-cee29f534863" Mar 20 19:17:33.489135 containerd[1481]: time="2025-03-20T19:17:33.489065769Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:17:33.490553 containerd[1481]: time="2025-03-20T19:17:33.490493554Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075" Mar 20 19:17:33.491916 containerd[1481]: time="2025-03-20T19:17:33.491868356Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:17:33.495110 containerd[1481]: time="2025-03-20T19:17:33.494836210Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:17:33.495663 containerd[1481]: time="2025-03-20T19:17:33.495630567Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 3.20183931s" Mar 20 19:17:33.495725 containerd[1481]: time="2025-03-20T19:17:33.495660735Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\"" Mar 20 19:17:33.497420 containerd[1481]: time="2025-03-20T19:17:33.497394748Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 20 19:17:33.511864 containerd[1481]: time="2025-03-20T19:17:33.511833221Z" level=info msg="CreateContainer within sandbox \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 20 19:17:33.529654 containerd[1481]: time="2025-03-20T19:17:33.528659633Z" level=info msg="Container bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610: CDI devices from CRI Config.CDIDevices: []" Mar 20 19:17:33.537557 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3393826160.mount: Deactivated successfully. 
Mar 20 19:17:33.549268 containerd[1481]: time="2025-03-20T19:17:33.549210995Z" level=info msg="CreateContainer within sandbox \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\"" Mar 20 19:17:33.551019 containerd[1481]: time="2025-03-20T19:17:33.550842722Z" level=info msg="StartContainer for \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\"" Mar 20 19:17:33.553557 containerd[1481]: time="2025-03-20T19:17:33.553505059Z" level=info msg="connecting to shim bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610" address="unix:///run/containerd/s/62fc3c0634526df2df31e7bea3de71b4d718e7553848089037116fa7b53a09cd" protocol=ttrpc version=3 Mar 20 19:17:33.590550 systemd[1]: Started cri-containerd-bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610.scope - libcontainer container bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610. 
Mar 20 19:17:33.672152 containerd[1481]: time="2025-03-20T19:17:33.672048184Z" level=info msg="StartContainer for \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\" returns successfully" Mar 20 19:17:33.711984 kubelet[2812]: E0320 19:17:33.711925 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h7zvw" podUID="90ee6a2c-61e2-4dc7-b67c-cee29f534863" Mar 20 19:17:33.871626 kubelet[2812]: E0320 19:17:33.871599 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.871867 kubelet[2812]: W0320 19:17:33.871678 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.871867 kubelet[2812]: E0320 19:17:33.871700 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:33.872233 kubelet[2812]: E0320 19:17:33.872145 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.872233 kubelet[2812]: W0320 19:17:33.872157 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.872233 kubelet[2812]: E0320 19:17:33.872168 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:33.873276 kubelet[2812]: E0320 19:17:33.873264 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.873429 kubelet[2812]: W0320 19:17:33.873362 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.873429 kubelet[2812]: E0320 19:17:33.873380 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:33.873781 kubelet[2812]: E0320 19:17:33.873709 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.873781 kubelet[2812]: W0320 19:17:33.873722 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.873781 kubelet[2812]: E0320 19:17:33.873732 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:33.874379 kubelet[2812]: E0320 19:17:33.874067 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.874379 kubelet[2812]: W0320 19:17:33.874078 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.874379 kubelet[2812]: E0320 19:17:33.874088 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:33.875503 kubelet[2812]: E0320 19:17:33.875300 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.875503 kubelet[2812]: W0320 19:17:33.875315 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.875503 kubelet[2812]: E0320 19:17:33.875328 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:33.876216 kubelet[2812]: E0320 19:17:33.876146 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.876216 kubelet[2812]: W0320 19:17:33.876160 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.876216 kubelet[2812]: E0320 19:17:33.876172 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:33.878338 kubelet[2812]: E0320 19:17:33.876603 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.878338 kubelet[2812]: W0320 19:17:33.876615 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.878338 kubelet[2812]: E0320 19:17:33.876626 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:33.878338 kubelet[2812]: I0320 19:17:33.877111 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7fdc4c8bbd-55qvz" podStartSLOduration=1.67288951 podStartE2EDuration="4.877098036s" podCreationTimestamp="2025-03-20 19:17:29 +0000 UTC" firstStartedPulling="2025-03-20 19:17:30.292613581 +0000 UTC m=+22.687435794" lastFinishedPulling="2025-03-20 19:17:33.496822118 +0000 UTC m=+25.891644320" observedRunningTime="2025-03-20 19:17:33.874920119 +0000 UTC m=+26.269742321" watchObservedRunningTime="2025-03-20 19:17:33.877098036 +0000 UTC m=+26.271920248" Mar 20 19:17:33.878800 kubelet[2812]: E0320 19:17:33.878787 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.878894 kubelet[2812]: W0320 19:17:33.878881 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.879045 kubelet[2812]: E0320 19:17:33.879032 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:33.879370 kubelet[2812]: E0320 19:17:33.879329 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.879517 kubelet[2812]: W0320 19:17:33.879448 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.879659 kubelet[2812]: E0320 19:17:33.879587 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:33.879943 kubelet[2812]: E0320 19:17:33.879932 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.880185 kubelet[2812]: W0320 19:17:33.880110 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.880185 kubelet[2812]: E0320 19:17:33.880128 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:33.881376 kubelet[2812]: E0320 19:17:33.881110 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.881376 kubelet[2812]: W0320 19:17:33.881123 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.881376 kubelet[2812]: E0320 19:17:33.881135 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:33.882405 kubelet[2812]: E0320 19:17:33.882212 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.882405 kubelet[2812]: W0320 19:17:33.882225 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.882405 kubelet[2812]: E0320 19:17:33.882236 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:33.883651 kubelet[2812]: E0320 19:17:33.883608 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.883651 kubelet[2812]: W0320 19:17:33.883622 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.883651 kubelet[2812]: E0320 19:17:33.883635 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:33.884242 kubelet[2812]: E0320 19:17:33.884223 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.884242 kubelet[2812]: W0320 19:17:33.884236 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.884324 kubelet[2812]: E0320 19:17:33.884247 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:33.888323 kubelet[2812]: E0320 19:17:33.888302 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.888323 kubelet[2812]: W0320 19:17:33.888320 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.888441 kubelet[2812]: E0320 19:17:33.888413 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 20 19:17:33.889979 kubelet[2812]: E0320 19:17:33.888717 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.889979 kubelet[2812]: W0320 19:17:33.888731 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.889979 kubelet[2812]: E0320 19:17:33.888769 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 20 19:17:33.889979 kubelet[2812]: E0320 19:17:33.889061 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 20 19:17:33.889979 kubelet[2812]: W0320 19:17:33.889071 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 20 19:17:33.889979 kubelet[2812]: E0320 19:17:33.889097 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 20 19:17:33.889979 kubelet[2812]: E0320 19:17:33.889292 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.889979 kubelet[2812]: W0320 19:17:33.889301 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.889979 kubelet[2812]: E0320 19:17:33.889336 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.889979 kubelet[2812]: E0320 19:17:33.889570 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.890246 kubelet[2812]: W0320 19:17:33.889581 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.890246 kubelet[2812]: E0320 19:17:33.889622 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.890246 kubelet[2812]: E0320 19:17:33.889839 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.890246 kubelet[2812]: W0320 19:17:33.889847 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.890246 kubelet[2812]: E0320 19:17:33.889865 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.890246 kubelet[2812]: E0320 19:17:33.890080 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.890246 kubelet[2812]: W0320 19:17:33.890089 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.890246 kubelet[2812]: E0320 19:17:33.890102 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.890458 kubelet[2812]: E0320 19:17:33.890357 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.890458 kubelet[2812]: W0320 19:17:33.890367 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.890458 kubelet[2812]: E0320 19:17:33.890384 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.890568 kubelet[2812]: E0320 19:17:33.890545 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.890568 kubelet[2812]: W0320 19:17:33.890558 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.890660 kubelet[2812]: E0320 19:17:33.890640 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.890844 kubelet[2812]: E0320 19:17:33.890810 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.890844 kubelet[2812]: W0320 19:17:33.890824 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.890963 kubelet[2812]: E0320 19:17:33.890926 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.891161 kubelet[2812]: E0320 19:17:33.891146 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.891161 kubelet[2812]: W0320 19:17:33.891159 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.891232 kubelet[2812]: E0320 19:17:33.891173 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.891805 kubelet[2812]: E0320 19:17:33.891777 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.891805 kubelet[2812]: W0320 19:17:33.891795 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.891901 kubelet[2812]: E0320 19:17:33.891847 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.892940 kubelet[2812]: E0320 19:17:33.892659 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.892940 kubelet[2812]: W0320 19:17:33.892678 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.892940 kubelet[2812]: E0320 19:17:33.892695 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.893292 kubelet[2812]: E0320 19:17:33.893191 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.893292 kubelet[2812]: W0320 19:17:33.893278 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.893568 kubelet[2812]: E0320 19:17:33.893399 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.893855 kubelet[2812]: E0320 19:17:33.893837 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.893967 kubelet[2812]: W0320 19:17:33.893899 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.893967 kubelet[2812]: E0320 19:17:33.893917 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.894904 kubelet[2812]: E0320 19:17:33.894880 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.894904 kubelet[2812]: W0320 19:17:33.894896 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.895026 kubelet[2812]: E0320 19:17:33.894906 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.895423 kubelet[2812]: E0320 19:17:33.895401 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.895423 kubelet[2812]: W0320 19:17:33.895416 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.895508 kubelet[2812]: E0320 19:17:33.895426 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:33.896143 kubelet[2812]: E0320 19:17:33.896108 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:33.896209 kubelet[2812]: W0320 19:17:33.896154 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:33.896209 kubelet[2812]: E0320 19:17:33.896168 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.847024 kubelet[2812]: I0320 19:17:34.846932 2812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 19:17:34.891084 kubelet[2812]: E0320 19:17:34.890124 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.891084 kubelet[2812]: W0320 19:17:34.890165 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.891084 kubelet[2812]: E0320 19:17:34.890200 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.891084 kubelet[2812]: E0320 19:17:34.890667 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.891084 kubelet[2812]: W0320 19:17:34.890691 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.891084 kubelet[2812]: E0320 19:17:34.890717 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.892413 kubelet[2812]: E0320 19:17:34.892007 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.892413 kubelet[2812]: W0320 19:17:34.892036 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.892413 kubelet[2812]: E0320 19:17:34.892059 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.892961 kubelet[2812]: E0320 19:17:34.892727 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.892961 kubelet[2812]: W0320 19:17:34.892755 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.892961 kubelet[2812]: E0320 19:17:34.892778 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.893557 kubelet[2812]: E0320 19:17:34.893317 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.893557 kubelet[2812]: W0320 19:17:34.893343 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.893557 kubelet[2812]: E0320 19:17:34.893412 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.894002 kubelet[2812]: E0320 19:17:34.893973 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.894164 kubelet[2812]: W0320 19:17:34.894137 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.894569 kubelet[2812]: E0320 19:17:34.894316 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.894844 kubelet[2812]: E0320 19:17:34.894816 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.895060 kubelet[2812]: W0320 19:17:34.895033 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.895221 kubelet[2812]: E0320 19:17:34.895196 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.896004 kubelet[2812]: E0320 19:17:34.895751 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.896004 kubelet[2812]: W0320 19:17:34.895776 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.896004 kubelet[2812]: E0320 19:17:34.895798 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.896496 kubelet[2812]: E0320 19:17:34.896467 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.896819 kubelet[2812]: W0320 19:17:34.896611 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.896819 kubelet[2812]: E0320 19:17:34.896642 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.897103 kubelet[2812]: E0320 19:17:34.897076 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.897308 kubelet[2812]: W0320 19:17:34.897282 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.897714 kubelet[2812]: E0320 19:17:34.897512 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.898273 kubelet[2812]: E0320 19:17:34.898020 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.898273 kubelet[2812]: W0320 19:17:34.898050 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.898273 kubelet[2812]: E0320 19:17:34.898073 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.898722 kubelet[2812]: E0320 19:17:34.898694 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.898870 kubelet[2812]: W0320 19:17:34.898845 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.899032 kubelet[2812]: E0320 19:17:34.899006 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.899775 kubelet[2812]: E0320 19:17:34.899578 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.899775 kubelet[2812]: W0320 19:17:34.899606 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.899775 kubelet[2812]: E0320 19:17:34.899628 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.900536 kubelet[2812]: E0320 19:17:34.900250 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.900536 kubelet[2812]: W0320 19:17:34.900278 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.900536 kubelet[2812]: E0320 19:17:34.900301 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.901422 kubelet[2812]: E0320 19:17:34.901043 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.901422 kubelet[2812]: W0320 19:17:34.901071 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.901422 kubelet[2812]: E0320 19:17:34.901099 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.901961 kubelet[2812]: E0320 19:17:34.901665 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.901961 kubelet[2812]: W0320 19:17:34.901687 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.901961 kubelet[2812]: E0320 19:17:34.901708 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.902521 kubelet[2812]: E0320 19:17:34.902152 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.902521 kubelet[2812]: W0320 19:17:34.902176 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.902521 kubelet[2812]: E0320 19:17:34.902212 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.903322 kubelet[2812]: E0320 19:17:34.902616 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.903322 kubelet[2812]: W0320 19:17:34.902638 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.903322 kubelet[2812]: E0320 19:17:34.902660 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.904402 kubelet[2812]: E0320 19:17:34.903846 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.904402 kubelet[2812]: W0320 19:17:34.903913 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.904402 kubelet[2812]: E0320 19:17:34.903965 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.904730 kubelet[2812]: E0320 19:17:34.904700 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.904943 kubelet[2812]: W0320 19:17:34.904914 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.905201 kubelet[2812]: E0320 19:17:34.905139 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.906274 kubelet[2812]: E0320 19:17:34.906066 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.907079 kubelet[2812]: W0320 19:17:34.906996 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.907344 kubelet[2812]: E0320 19:17:34.907292 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.907649 kubelet[2812]: E0320 19:17:34.907604 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.907649 kubelet[2812]: W0320 19:17:34.907640 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.908438 kubelet[2812]: E0320 19:17:34.907970 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.908438 kubelet[2812]: E0320 19:17:34.908106 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.908438 kubelet[2812]: W0320 19:17:34.908125 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.908438 kubelet[2812]: E0320 19:17:34.908252 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.908730 kubelet[2812]: E0320 19:17:34.908715 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.908799 kubelet[2812]: W0320 19:17:34.908736 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.908876 kubelet[2812]: E0320 19:17:34.908801 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.909631 kubelet[2812]: E0320 19:17:34.909188 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.909631 kubelet[2812]: W0320 19:17:34.909221 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.909631 kubelet[2812]: E0320 19:17:34.909250 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.909877 kubelet[2812]: E0320 19:17:34.909806 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.909877 kubelet[2812]: W0320 19:17:34.909832 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.910235 kubelet[2812]: E0320 19:17:34.910081 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.910536 kubelet[2812]: E0320 19:17:34.910500 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.910536 kubelet[2812]: W0320 19:17:34.910532 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.911714 kubelet[2812]: E0320 19:17:34.911279 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.911843 kubelet[2812]: E0320 19:17:34.911783 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.911843 kubelet[2812]: W0320 19:17:34.911806 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.912017 kubelet[2812]: E0320 19:17:34.911836 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.913009 kubelet[2812]: E0320 19:17:34.912950 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.913009 kubelet[2812]: W0320 19:17:34.912984 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.913224 kubelet[2812]: E0320 19:17:34.913111 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.913592 kubelet[2812]: E0320 19:17:34.913510 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.913592 kubelet[2812]: W0320 19:17:34.913571 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.913774 kubelet[2812]: E0320 19:17:34.913699 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.914080 kubelet[2812]: E0320 19:17:34.914046 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.914080 kubelet[2812]: W0320 19:17:34.914076 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.914223 kubelet[2812]: E0320 19:17:34.914099 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.914574 kubelet[2812]: E0320 19:17:34.914539 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.914574 kubelet[2812]: W0320 19:17:34.914571 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.914574 kubelet[2812]: E0320 19:17:34.914593 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 20 19:17:34.915613 kubelet[2812]: E0320 19:17:34.915553 2812 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 20 19:17:34.915747 kubelet[2812]: W0320 19:17:34.915590 2812 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 20 19:17:34.915747 kubelet[2812]: E0320 19:17:34.915663 2812 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 20 19:17:35.547596 containerd[1481]: time="2025-03-20T19:17:35.547486556Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:17:35.548895 containerd[1481]: time="2025-03-20T19:17:35.548835077Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011"
Mar 20 19:17:35.550771 containerd[1481]: time="2025-03-20T19:17:35.550432977Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:17:35.560199 containerd[1481]: time="2025-03-20T19:17:35.560132719Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:17:35.561617 containerd[1481]: time="2025-03-20T19:17:35.561558738Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 2.064051023s"
Mar 20 19:17:35.561788 containerd[1481]: time="2025-03-20T19:17:35.561710299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\""
Mar 20 19:17:35.566404 containerd[1481]: time="2025-03-20T19:17:35.565386210Z" level=info msg="CreateContainer within sandbox \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Mar 20 19:17:35.580403 containerd[1481]: time="2025-03-20T19:17:35.579526293Z" level=info msg="Container 22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:17:35.597450 containerd[1481]: time="2025-03-20T19:17:35.597405228Z" level=info msg="CreateContainer within sandbox \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\""
Mar 20 19:17:35.598223 containerd[1481]: time="2025-03-20T19:17:35.597980192Z" level=info msg="StartContainer for \"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\""
Mar 20 19:17:35.599791 containerd[1481]: time="2025-03-20T19:17:35.599764850Z" level=info msg="connecting to shim 22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1" address="unix:///run/containerd/s/4299742a679952fdfa9653c36c4d7a1756b1af2c2d33988052baa381ea8b58cb" protocol=ttrpc version=3
Mar 20 19:17:35.624520 systemd[1]: Started cri-containerd-22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1.scope - libcontainer container 22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1.
Mar 20 19:17:35.670667 containerd[1481]: time="2025-03-20T19:17:35.670633686Z" level=info msg="StartContainer for \"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\" returns successfully"
Mar 20 19:17:35.681077 systemd[1]: cri-containerd-22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1.scope: Deactivated successfully.
Mar 20 19:17:35.684673 containerd[1481]: time="2025-03-20T19:17:35.684618159Z" level=info msg="TaskExit event in podsandbox handler container_id:\"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\" id:\"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\" pid:3487 exited_at:{seconds:1742498255 nanos:684145181}"
Mar 20 19:17:35.684794 containerd[1481]: time="2025-03-20T19:17:35.684747779Z" level=info msg="received exit event container_id:\"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\" id:\"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\" pid:3487 exited_at:{seconds:1742498255 nanos:684145181}"
Mar 20 19:17:35.708976 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1-rootfs.mount: Deactivated successfully.
Mar 20 19:17:35.711920 kubelet[2812]: E0320 19:17:35.711882 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h7zvw" podUID="90ee6a2c-61e2-4dc7-b67c-cee29f534863"
Mar 20 19:17:36.871461 containerd[1481]: time="2025-03-20T19:17:36.869394059Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\""
Mar 20 19:17:37.711520 kubelet[2812]: E0320 19:17:37.710857 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h7zvw" podUID="90ee6a2c-61e2-4dc7-b67c-cee29f534863"
Mar 20 19:17:39.713959 kubelet[2812]: E0320 19:17:39.712292 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady
message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h7zvw" podUID="90ee6a2c-61e2-4dc7-b67c-cee29f534863" Mar 20 19:17:41.710390 kubelet[2812]: E0320 19:17:41.710162 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h7zvw" podUID="90ee6a2c-61e2-4dc7-b67c-cee29f534863" Mar 20 19:17:42.680196 containerd[1481]: time="2025-03-20T19:17:42.679893819Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:17:42.681617 containerd[1481]: time="2025-03-20T19:17:42.681373215Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 20 19:17:42.683018 containerd[1481]: time="2025-03-20T19:17:42.682963185Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:17:42.685503 containerd[1481]: time="2025-03-20T19:17:42.685462528Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:17:42.686187 containerd[1481]: time="2025-03-20T19:17:42.686050426Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 5.816579339s" Mar 20 19:17:42.686187 containerd[1481]: time="2025-03-20T19:17:42.686087577Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 20 19:17:42.689615 containerd[1481]: time="2025-03-20T19:17:42.689577581Z" level=info msg="CreateContainer within sandbox \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 20 19:17:42.707040 containerd[1481]: time="2025-03-20T19:17:42.704522803Z" level=info msg="Container a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5: CDI devices from CRI Config.CDIDevices: []" Mar 20 19:17:42.709797 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3782736372.mount: Deactivated successfully. Mar 20 19:17:42.719274 containerd[1481]: time="2025-03-20T19:17:42.719120059Z" level=info msg="CreateContainer within sandbox \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\"" Mar 20 19:17:42.719274 containerd[1481]: time="2025-03-20T19:17:42.719717655Z" level=info msg="StartContainer for \"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\"" Mar 20 19:17:42.721946 containerd[1481]: time="2025-03-20T19:17:42.721905581Z" level=info msg="connecting to shim a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5" address="unix:///run/containerd/s/4299742a679952fdfa9653c36c4d7a1756b1af2c2d33988052baa381ea8b58cb" protocol=ttrpc version=3 Mar 20 19:17:42.747514 systemd[1]: Started cri-containerd-a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5.scope - libcontainer container a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5. 
Mar 20 19:17:42.791033 containerd[1481]: time="2025-03-20T19:17:42.790995676Z" level=info msg="StartContainer for \"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\" returns successfully" Mar 20 19:17:43.711216 kubelet[2812]: E0320 19:17:43.710457 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-h7zvw" podUID="90ee6a2c-61e2-4dc7-b67c-cee29f534863" Mar 20 19:17:43.842454 containerd[1481]: time="2025-03-20T19:17:43.842310958Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 20 19:17:43.847110 systemd[1]: cri-containerd-a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5.scope: Deactivated successfully. Mar 20 19:17:43.847952 systemd[1]: cri-containerd-a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5.scope: Consumed 611ms CPU time, 174.9M memory peak, 154M written to disk. 
Mar 20 19:17:43.853510 containerd[1481]: time="2025-03-20T19:17:43.853175563Z" level=info msg="received exit event container_id:\"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\" id:\"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\" pid:3545 exited_at:{seconds:1742498263 nanos:852737597}" Mar 20 19:17:43.853510 containerd[1481]: time="2025-03-20T19:17:43.853444286Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\" id:\"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\" pid:3545 exited_at:{seconds:1742498263 nanos:852737597}" Mar 20 19:17:43.897182 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5-rootfs.mount: Deactivated successfully. Mar 20 19:17:43.910362 kubelet[2812]: I0320 19:17:43.910293 2812 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 20 19:17:44.261778 kubelet[2812]: I0320 19:17:44.261616 2812 topology_manager.go:215] "Topology Admit Handler" podUID="6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40" podNamespace="kube-system" podName="coredns-7db6d8ff4d-6txdn" Mar 20 19:17:44.271789 kubelet[2812]: I0320 19:17:44.271024 2812 topology_manager.go:215] "Topology Admit Handler" podUID="5b1b32a2-c65c-4c93-a4b3-5346ab5325b0" podNamespace="kube-system" podName="coredns-7db6d8ff4d-lkvbx" Mar 20 19:17:44.273598 kubelet[2812]: I0320 19:17:44.273553 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9b5w\" (UniqueName: \"kubernetes.io/projected/5b1b32a2-c65c-4c93-a4b3-5346ab5325b0-kube-api-access-b9b5w\") pod \"coredns-7db6d8ff4d-lkvbx\" (UID: \"5b1b32a2-c65c-4c93-a4b3-5346ab5325b0\") " pod="kube-system/coredns-7db6d8ff4d-lkvbx" Mar 20 19:17:44.273962 kubelet[2812]: I0320 19:17:44.273871 2812 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b92kk\" (UniqueName: \"kubernetes.io/projected/6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40-kube-api-access-b92kk\") pod \"coredns-7db6d8ff4d-6txdn\" (UID: \"6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40\") " pod="kube-system/coredns-7db6d8ff4d-6txdn" Mar 20 19:17:44.274306 kubelet[2812]: I0320 19:17:44.274127 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b1b32a2-c65c-4c93-a4b3-5346ab5325b0-config-volume\") pod \"coredns-7db6d8ff4d-lkvbx\" (UID: \"5b1b32a2-c65c-4c93-a4b3-5346ab5325b0\") " pod="kube-system/coredns-7db6d8ff4d-lkvbx" Mar 20 19:17:44.274306 kubelet[2812]: I0320 19:17:44.274244 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40-config-volume\") pod \"coredns-7db6d8ff4d-6txdn\" (UID: \"6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40\") " pod="kube-system/coredns-7db6d8ff4d-6txdn" Mar 20 19:17:44.275446 kubelet[2812]: I0320 19:17:44.275074 2812 topology_manager.go:215] "Topology Admit Handler" podUID="357e32cb-acd2-40bd-a1ff-f4c6a737b6d0" podNamespace="calico-system" podName="calico-kube-controllers-98754bff4-5m8cp" Mar 20 19:17:44.289492 kubelet[2812]: I0320 19:17:44.287686 2812 topology_manager.go:215] "Topology Admit Handler" podUID="5673035b-c7fa-4379-9bba-5327f7dcb354" podNamespace="calico-apiserver" podName="calico-apiserver-594b5557b6-x9x55" Mar 20 19:17:44.292398 kubelet[2812]: I0320 19:17:44.291207 2812 topology_manager.go:215] "Topology Admit Handler" podUID="a6d2a3a2-f4d5-444a-902c-80afb149202f" podNamespace="calico-apiserver" podName="calico-apiserver-594b5557b6-ljpn7" Mar 20 19:17:44.296293 systemd[1]: Created slice kubepods-burstable-pod6257ddcf_2a54_49d8_9f21_1a5fdc8c8b40.slice - libcontainer container 
kubepods-burstable-pod6257ddcf_2a54_49d8_9f21_1a5fdc8c8b40.slice. Mar 20 19:17:44.316618 systemd[1]: Created slice kubepods-burstable-pod5b1b32a2_c65c_4c93_a4b3_5346ab5325b0.slice - libcontainer container kubepods-burstable-pod5b1b32a2_c65c_4c93_a4b3_5346ab5325b0.slice. Mar 20 19:17:44.329875 systemd[1]: Created slice kubepods-besteffort-pod357e32cb_acd2_40bd_a1ff_f4c6a737b6d0.slice - libcontainer container kubepods-besteffort-pod357e32cb_acd2_40bd_a1ff_f4c6a737b6d0.slice. Mar 20 19:17:44.336319 systemd[1]: Created slice kubepods-besteffort-pod5673035b_c7fa_4379_9bba_5327f7dcb354.slice - libcontainer container kubepods-besteffort-pod5673035b_c7fa_4379_9bba_5327f7dcb354.slice. Mar 20 19:17:44.341428 systemd[1]: Created slice kubepods-besteffort-poda6d2a3a2_f4d5_444a_902c_80afb149202f.slice - libcontainer container kubepods-besteffort-poda6d2a3a2_f4d5_444a_902c_80afb149202f.slice. Mar 20 19:17:44.375191 kubelet[2812]: I0320 19:17:44.375137 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s29sk\" (UniqueName: \"kubernetes.io/projected/5673035b-c7fa-4379-9bba-5327f7dcb354-kube-api-access-s29sk\") pod \"calico-apiserver-594b5557b6-x9x55\" (UID: \"5673035b-c7fa-4379-9bba-5327f7dcb354\") " pod="calico-apiserver/calico-apiserver-594b5557b6-x9x55" Mar 20 19:17:44.376233 kubelet[2812]: I0320 19:17:44.375264 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357e32cb-acd2-40bd-a1ff-f4c6a737b6d0-tigera-ca-bundle\") pod \"calico-kube-controllers-98754bff4-5m8cp\" (UID: \"357e32cb-acd2-40bd-a1ff-f4c6a737b6d0\") " pod="calico-system/calico-kube-controllers-98754bff4-5m8cp" Mar 20 19:17:44.376233 kubelet[2812]: I0320 19:17:44.375323 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/5673035b-c7fa-4379-9bba-5327f7dcb354-calico-apiserver-certs\") pod \"calico-apiserver-594b5557b6-x9x55\" (UID: \"5673035b-c7fa-4379-9bba-5327f7dcb354\") " pod="calico-apiserver/calico-apiserver-594b5557b6-x9x55" Mar 20 19:17:44.376233 kubelet[2812]: I0320 19:17:44.375416 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4lrt\" (UniqueName: \"kubernetes.io/projected/a6d2a3a2-f4d5-444a-902c-80afb149202f-kube-api-access-d4lrt\") pod \"calico-apiserver-594b5557b6-ljpn7\" (UID: \"a6d2a3a2-f4d5-444a-902c-80afb149202f\") " pod="calico-apiserver/calico-apiserver-594b5557b6-ljpn7" Mar 20 19:17:44.376233 kubelet[2812]: I0320 19:17:44.375500 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a6d2a3a2-f4d5-444a-902c-80afb149202f-calico-apiserver-certs\") pod \"calico-apiserver-594b5557b6-ljpn7\" (UID: \"a6d2a3a2-f4d5-444a-902c-80afb149202f\") " pod="calico-apiserver/calico-apiserver-594b5557b6-ljpn7" Mar 20 19:17:44.376233 kubelet[2812]: I0320 19:17:44.376056 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhj8v\" (UniqueName: \"kubernetes.io/projected/357e32cb-acd2-40bd-a1ff-f4c6a737b6d0-kube-api-access-mhj8v\") pod \"calico-kube-controllers-98754bff4-5m8cp\" (UID: \"357e32cb-acd2-40bd-a1ff-f4c6a737b6d0\") " pod="calico-system/calico-kube-controllers-98754bff4-5m8cp" Mar 20 19:17:44.610454 containerd[1481]: time="2025-03-20T19:17:44.610223564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-6txdn,Uid:6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40,Namespace:kube-system,Attempt:0,}" Mar 20 19:17:44.625590 containerd[1481]: time="2025-03-20T19:17:44.625340471Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-lkvbx,Uid:5b1b32a2-c65c-4c93-a4b3-5346ab5325b0,Namespace:kube-system,Attempt:0,}" Mar 20 19:17:44.634944 containerd[1481]: time="2025-03-20T19:17:44.634844278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-98754bff4-5m8cp,Uid:357e32cb-acd2-40bd-a1ff-f4c6a737b6d0,Namespace:calico-system,Attempt:0,}" Mar 20 19:17:44.641021 containerd[1481]: time="2025-03-20T19:17:44.640885975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594b5557b6-x9x55,Uid:5673035b-c7fa-4379-9bba-5327f7dcb354,Namespace:calico-apiserver,Attempt:0,}" Mar 20 19:17:44.648101 containerd[1481]: time="2025-03-20T19:17:44.647993304Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594b5557b6-ljpn7,Uid:a6d2a3a2-f4d5-444a-902c-80afb149202f,Namespace:calico-apiserver,Attempt:0,}" Mar 20 19:17:44.942376 containerd[1481]: time="2025-03-20T19:17:44.941430197Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 20 19:17:45.074105 containerd[1481]: time="2025-03-20T19:17:45.074060472Z" level=error msg="Failed to destroy network for sandbox \"04ccc4bf3171b59e118f5c0839ed8200a520088337ca881c2cfb427da8ce5b44\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 19:17:45.079241 systemd[1]: run-netns-cni\x2d13b92766\x2dd4ca\x2d1934\x2db8ce\x2d1cddaab27012.mount: Deactivated successfully. 
Mar 20 19:17:45.083140 containerd[1481]: time="2025-03-20T19:17:45.083098961Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-lkvbx,Uid:5b1b32a2-c65c-4c93-a4b3-5346ab5325b0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"04ccc4bf3171b59e118f5c0839ed8200a520088337ca881c2cfb427da8ce5b44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 19:17:45.083606 kubelet[2812]: E0320 19:17:45.083454 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04ccc4bf3171b59e118f5c0839ed8200a520088337ca881c2cfb427da8ce5b44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 19:17:45.083847 kubelet[2812]: E0320 19:17:45.083602 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04ccc4bf3171b59e118f5c0839ed8200a520088337ca881c2cfb427da8ce5b44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-lkvbx" Mar 20 19:17:45.083847 kubelet[2812]: E0320 19:17:45.083653 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"04ccc4bf3171b59e118f5c0839ed8200a520088337ca881c2cfb427da8ce5b44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-7db6d8ff4d-lkvbx" Mar 20 19:17:45.083847 kubelet[2812]: E0320 19:17:45.083706 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-lkvbx_kube-system(5b1b32a2-c65c-4c93-a4b3-5346ab5325b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-lkvbx_kube-system(5b1b32a2-c65c-4c93-a4b3-5346ab5325b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"04ccc4bf3171b59e118f5c0839ed8200a520088337ca881c2cfb427da8ce5b44\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-lkvbx" podUID="5b1b32a2-c65c-4c93-a4b3-5346ab5325b0" Mar 20 19:17:45.085328 containerd[1481]: time="2025-03-20T19:17:45.085275781Z" level=error msg="Failed to destroy network for sandbox \"244f0a0e3b6cd09f1f64e299a08a4e0c6e9f396c2e72f1601b5cd44c62ac35b0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 19:17:45.087309 containerd[1481]: time="2025-03-20T19:17:45.087278610Z" level=error msg="Failed to destroy network for sandbox \"37e5aecfb678bc33ae159b0454f8b7b7d4ca728218d607a82e4c8190bd703a9d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 19:17:45.089214 containerd[1481]: time="2025-03-20T19:17:45.088977720Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-98754bff4-5m8cp,Uid:357e32cb-acd2-40bd-a1ff-f4c6a737b6d0,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"244f0a0e3b6cd09f1f64e299a08a4e0c6e9f396c2e72f1601b5cd44c62ac35b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 19:17:45.090521 kubelet[2812]: E0320 19:17:45.089488 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"244f0a0e3b6cd09f1f64e299a08a4e0c6e9f396c2e72f1601b5cd44c62ac35b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 19:17:45.090521 kubelet[2812]: E0320 19:17:45.089538 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"244f0a0e3b6cd09f1f64e299a08a4e0c6e9f396c2e72f1601b5cd44c62ac35b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-98754bff4-5m8cp" Mar 20 19:17:45.090521 kubelet[2812]: E0320 19:17:45.089579 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"244f0a0e3b6cd09f1f64e299a08a4e0c6e9f396c2e72f1601b5cd44c62ac35b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-98754bff4-5m8cp" Mar 20 19:17:45.090654 kubelet[2812]: E0320 19:17:45.089621 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-98754bff4-5m8cp_calico-system(357e32cb-acd2-40bd-a1ff-f4c6a737b6d0)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"calico-kube-controllers-98754bff4-5m8cp_calico-system(357e32cb-acd2-40bd-a1ff-f4c6a737b6d0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"244f0a0e3b6cd09f1f64e299a08a4e0c6e9f396c2e72f1601b5cd44c62ac35b0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-98754bff4-5m8cp" podUID="357e32cb-acd2-40bd-a1ff-f4c6a737b6d0" Mar 20 19:17:45.091128 containerd[1481]: time="2025-03-20T19:17:45.090942128Z" level=error msg="Failed to destroy network for sandbox \"8e4851cfb49dc5fefa30b5d0a81318c56a8f3a750daadbd0b7df0172f0588f34\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 19:17:45.091317 containerd[1481]: time="2025-03-20T19:17:45.091269351Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594b5557b6-x9x55,Uid:5673035b-c7fa-4379-9bba-5327f7dcb354,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"37e5aecfb678bc33ae159b0454f8b7b7d4ca728218d607a82e4c8190bd703a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 19:17:45.092084 kubelet[2812]: E0320 19:17:45.091814 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37e5aecfb678bc33ae159b0454f8b7b7d4ca728218d607a82e4c8190bd703a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 20 19:17:45.092084 kubelet[2812]: E0320 19:17:45.091899 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37e5aecfb678bc33ae159b0454f8b7b7d4ca728218d607a82e4c8190bd703a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594b5557b6-x9x55" Mar 20 19:17:45.092084 kubelet[2812]: E0320 19:17:45.091929 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"37e5aecfb678bc33ae159b0454f8b7b7d4ca728218d607a82e4c8190bd703a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594b5557b6-x9x55" Mar 20 19:17:45.092304 kubelet[2812]: E0320 19:17:45.092004 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-594b5557b6-x9x55_calico-apiserver(5673035b-c7fa-4379-9bba-5327f7dcb354)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-594b5557b6-x9x55_calico-apiserver(5673035b-c7fa-4379-9bba-5327f7dcb354)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"37e5aecfb678bc33ae159b0454f8b7b7d4ca728218d607a82e4c8190bd703a9d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-594b5557b6-x9x55" podUID="5673035b-c7fa-4379-9bba-5327f7dcb354" Mar 20 19:17:45.093245 systemd[1]: run-netns-cni\x2d4fae06ad\x2ded85\x2d8703\x2da7ff\x2d0b2c941c3484.mount: Deactivated 
successfully. Mar 20 19:17:45.094536 containerd[1481]: time="2025-03-20T19:17:45.093320730Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-6txdn,Uid:6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e4851cfb49dc5fefa30b5d0a81318c56a8f3a750daadbd0b7df0172f0588f34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 19:17:45.093615 systemd[1]: run-netns-cni\x2d3ae2738e\x2db56e\x2debfe\x2ddf0f\x2d14dbd02d9a3e.mount: Deactivated successfully. Mar 20 19:17:45.095005 kubelet[2812]: E0320 19:17:45.094768 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e4851cfb49dc5fefa30b5d0a81318c56a8f3a750daadbd0b7df0172f0588f34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 20 19:17:45.095005 kubelet[2812]: E0320 19:17:45.094840 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e4851cfb49dc5fefa30b5d0a81318c56a8f3a750daadbd0b7df0172f0588f34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-6txdn" Mar 20 19:17:45.095005 kubelet[2812]: E0320 19:17:45.094863 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e4851cfb49dc5fefa30b5d0a81318c56a8f3a750daadbd0b7df0172f0588f34\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-6txdn" Mar 20 19:17:45.095254 kubelet[2812]: E0320 19:17:45.095191 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-6txdn_kube-system(6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-6txdn_kube-system(6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8e4851cfb49dc5fefa30b5d0a81318c56a8f3a750daadbd0b7df0172f0588f34\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-6txdn" podUID="6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40" Mar 20 19:17:45.099909 systemd[1]: run-netns-cni\x2d7ee1322e\x2d39ef\x2dafc2\x2d7891\x2d07d32e56e3f4.mount: Deactivated successfully. 
Mar 20 19:17:45.123285 containerd[1481]: time="2025-03-20T19:17:45.123248169Z" level=error msg="Failed to destroy network for sandbox \"14dcc7b1077ec4b6194b6740b36babc778f1c2f7b0d0e326ae2064f1d0162b45\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 20 19:17:45.125168 containerd[1481]: time="2025-03-20T19:17:45.125137248Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594b5557b6-ljpn7,Uid:a6d2a3a2-f4d5-444a-902c-80afb149202f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"14dcc7b1077ec4b6194b6740b36babc778f1c2f7b0d0e326ae2064f1d0162b45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 20 19:17:45.125487 kubelet[2812]: E0320 19:17:45.125453 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14dcc7b1077ec4b6194b6740b36babc778f1c2f7b0d0e326ae2064f1d0162b45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 20 19:17:45.125640 kubelet[2812]: E0320 19:17:45.125620 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14dcc7b1077ec4b6194b6740b36babc778f1c2f7b0d0e326ae2064f1d0162b45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594b5557b6-ljpn7"
Mar 20 19:17:45.125885 kubelet[2812]: E0320 19:17:45.125776 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14dcc7b1077ec4b6194b6740b36babc778f1c2f7b0d0e326ae2064f1d0162b45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-594b5557b6-ljpn7"
Mar 20 19:17:45.126018 kubelet[2812]: E0320 19:17:45.125861 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-594b5557b6-ljpn7_calico-apiserver(a6d2a3a2-f4d5-444a-902c-80afb149202f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-594b5557b6-ljpn7_calico-apiserver(a6d2a3a2-f4d5-444a-902c-80afb149202f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"14dcc7b1077ec4b6194b6740b36babc778f1c2f7b0d0e326ae2064f1d0162b45\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-594b5557b6-ljpn7" podUID="a6d2a3a2-f4d5-444a-902c-80afb149202f"
Mar 20 19:17:45.724838 systemd[1]: Created slice kubepods-besteffort-pod90ee6a2c_61e2_4dc7_b67c_cee29f534863.slice - libcontainer container kubepods-besteffort-pod90ee6a2c_61e2_4dc7_b67c_cee29f534863.slice.
Mar 20 19:17:45.733013 containerd[1481]: time="2025-03-20T19:17:45.732942462Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h7zvw,Uid:90ee6a2c-61e2-4dc7-b67c-cee29f534863,Namespace:calico-system,Attempt:0,}"
Mar 20 19:17:45.828878 containerd[1481]: time="2025-03-20T19:17:45.828753804Z" level=error msg="Failed to destroy network for sandbox \"97f14716e64106a0a03239e1af9fd0877e3ca000f73b25a97818cf6a92af0ed0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 20 19:17:45.830470 containerd[1481]: time="2025-03-20T19:17:45.830389106Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h7zvw,Uid:90ee6a2c-61e2-4dc7-b67c-cee29f534863,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"97f14716e64106a0a03239e1af9fd0877e3ca000f73b25a97818cf6a92af0ed0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 20 19:17:45.830947 kubelet[2812]: E0320 19:17:45.830600 2812 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97f14716e64106a0a03239e1af9fd0877e3ca000f73b25a97818cf6a92af0ed0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 20 19:17:45.830947 kubelet[2812]: E0320 19:17:45.830657 2812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97f14716e64106a0a03239e1af9fd0877e3ca000f73b25a97818cf6a92af0ed0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h7zvw"
Mar 20 19:17:45.830947 kubelet[2812]: E0320 19:17:45.830679 2812 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"97f14716e64106a0a03239e1af9fd0877e3ca000f73b25a97818cf6a92af0ed0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-h7zvw"
Mar 20 19:17:45.831227 kubelet[2812]: E0320 19:17:45.830727 2812 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-h7zvw_calico-system(90ee6a2c-61e2-4dc7-b67c-cee29f534863)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-h7zvw_calico-system(90ee6a2c-61e2-4dc7-b67c-cee29f534863)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"97f14716e64106a0a03239e1af9fd0877e3ca000f73b25a97818cf6a92af0ed0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-h7zvw" podUID="90ee6a2c-61e2-4dc7-b67c-cee29f534863"
Mar 20 19:17:45.898658 systemd[1]: run-netns-cni\x2d87ef43f5\x2d9fca\x2d3bea\x2da3ca\x2d269a6084ca0a.mount: Deactivated successfully.
Mar 20 19:17:51.515422 kubelet[2812]: I0320 19:17:51.515314 2812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 19:17:53.894745 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1692224896.mount: Deactivated successfully.
Mar 20 19:17:54.250530 containerd[1481]: time="2025-03-20T19:17:54.248453014Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:17:54.251154 containerd[1481]: time="2025-03-20T19:17:54.250534740Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445"
Mar 20 19:17:54.253638 containerd[1481]: time="2025-03-20T19:17:54.253579737Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:17:54.260913 containerd[1481]: time="2025-03-20T19:17:54.260855572Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:17:54.262950 containerd[1481]: time="2025-03-20T19:17:54.262068245Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 9.320358745s"
Mar 20 19:17:54.263681 containerd[1481]: time="2025-03-20T19:17:54.263636087Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\""
Mar 20 19:17:54.294925 containerd[1481]: time="2025-03-20T19:17:54.294861551Z" level=info msg="CreateContainer within sandbox \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 20 19:17:54.325404 containerd[1481]: time="2025-03-20T19:17:54.321954648Z" level=info msg="Container aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:17:54.337696 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3149709175.mount: Deactivated successfully.
Mar 20 19:17:54.356717 containerd[1481]: time="2025-03-20T19:17:54.356653574Z" level=info msg="CreateContainer within sandbox \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\""
Mar 20 19:17:54.360417 containerd[1481]: time="2025-03-20T19:17:54.358134966Z" level=info msg="StartContainer for \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\""
Mar 20 19:17:54.362483 containerd[1481]: time="2025-03-20T19:17:54.362429502Z" level=info msg="connecting to shim aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c" address="unix:///run/containerd/s/4299742a679952fdfa9653c36c4d7a1756b1af2c2d33988052baa381ea8b58cb" protocol=ttrpc version=3
Mar 20 19:17:54.401558 systemd[1]: Started cri-containerd-aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c.scope - libcontainer container aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c.
Mar 20 19:17:54.461241 containerd[1481]: time="2025-03-20T19:17:54.461209063Z" level=info msg="StartContainer for \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" returns successfully"
Mar 20 19:17:54.529190 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Mar 20 19:17:54.529329 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld. All Rights Reserved.
Mar 20 19:17:55.005999 kubelet[2812]: I0320 19:17:55.005873 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-8zjwg" podStartSLOduration=2.043374908 podStartE2EDuration="26.005842898s" podCreationTimestamp="2025-03-20 19:17:29 +0000 UTC" firstStartedPulling="2025-03-20 19:17:30.302742652 +0000 UTC m=+22.697564854" lastFinishedPulling="2025-03-20 19:17:54.265210582 +0000 UTC m=+46.660032844" observedRunningTime="2025-03-20 19:17:54.9987152 +0000 UTC m=+47.393537412" watchObservedRunningTime="2025-03-20 19:17:55.005842898 +0000 UTC m=+47.400665150"
Mar 20 19:17:55.711432 containerd[1481]: time="2025-03-20T19:17:55.710815237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594b5557b6-ljpn7,Uid:a6d2a3a2-f4d5-444a-902c-80afb149202f,Namespace:calico-apiserver,Attempt:0,}"
Mar 20 19:17:55.947282 systemd-networkd[1393]: calibc2bf404381: Link UP
Mar 20 19:17:55.948326 systemd-networkd[1393]: calibc2bf404381: Gained carrier
Mar 20 19:17:55.970683 containerd[1481]: 2025-03-20 19:17:55.775 [INFO][3831] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Mar 20 19:17:55.970683 containerd[1481]: 2025-03-20 19:17:55.813 [INFO][3831] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0 calico-apiserver-594b5557b6- calico-apiserver a6d2a3a2-f4d5-444a-902c-80afb149202f 739 0 2025-03-20 19:17:30 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:594b5557b6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-0-1-1-f6fba67404.novalocal calico-apiserver-594b5557b6-ljpn7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calibc2bf404381 [] []}} ContainerID="387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-ljpn7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-"
Mar 20 19:17:55.970683 containerd[1481]: 2025-03-20 19:17:55.813 [INFO][3831] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-ljpn7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0"
Mar 20 19:17:55.970683 containerd[1481]: 2025-03-20 19:17:55.872 [INFO][3844] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" HandleID="k8s-pod-network.387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0"
Mar 20 19:17:55.970982 containerd[1481]: 2025-03-20 19:17:55.890 [INFO][3844] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" HandleID="k8s-pod-network.387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003194b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-0-1-1-f6fba67404.novalocal", "pod":"calico-apiserver-594b5557b6-ljpn7", "timestamp":"2025-03-20 19:17:55.872186311 +0000 UTC"}, Hostname:"ci-9999-0-1-1-f6fba67404.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 19:17:55.970982 containerd[1481]: 2025-03-20 19:17:55.890 [INFO][3844] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 19:17:55.970982 containerd[1481]: 2025-03-20 19:17:55.891 [INFO][3844] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 19:17:55.970982 containerd[1481]: 2025-03-20 19:17:55.891 [INFO][3844] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-1-1-f6fba67404.novalocal'
Mar 20 19:17:55.970982 containerd[1481]: 2025-03-20 19:17:55.894 [INFO][3844] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:55.970982 containerd[1481]: 2025-03-20 19:17:55.900 [INFO][3844] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:55.970982 containerd[1481]: 2025-03-20 19:17:55.907 [INFO][3844] ipam/ipam.go 489: Trying affinity for 192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:55.970982 containerd[1481]: 2025-03-20 19:17:55.910 [INFO][3844] ipam/ipam.go 155: Attempting to load block cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:55.970982 containerd[1481]: 2025-03-20 19:17:55.912 [INFO][3844] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:55.971223 containerd[1481]: 2025-03-20 19:17:55.913 [INFO][3844] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:55.971223 containerd[1481]: 2025-03-20 19:17:55.915 [INFO][3844] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e
Mar 20 19:17:55.971223 containerd[1481]: 2025-03-20 19:17:55.923 [INFO][3844] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:55.971223 containerd[1481]: 2025-03-20 19:17:55.930 [INFO][3844] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.96.129/26] block=192.168.96.128/26 handle="k8s-pod-network.387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:55.971223 containerd[1481]: 2025-03-20 19:17:55.930 [INFO][3844] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.129/26] handle="k8s-pod-network.387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:17:55.971223 containerd[1481]: 2025-03-20 19:17:55.930 [INFO][3844] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 19:17:55.971223 containerd[1481]: 2025-03-20 19:17:55.930 [INFO][3844] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.96.129/26] IPv6=[] ContainerID="387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" HandleID="k8s-pod-network.387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0"
Mar 20 19:17:55.971411 containerd[1481]: 2025-03-20 19:17:55.933 [INFO][3831] cni-plugin/k8s.go 386: Populated endpoint ContainerID="387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-ljpn7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0", GenerateName:"calico-apiserver-594b5557b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"a6d2a3a2-f4d5-444a-902c-80afb149202f", ResourceVersion:"739", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 30, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"594b5557b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"", Pod:"calico-apiserver-594b5557b6-ljpn7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibc2bf404381", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 19:17:55.971483 containerd[1481]: 2025-03-20 19:17:55.934 [INFO][3831] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.96.129/32] ContainerID="387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-ljpn7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0"
Mar 20 19:17:55.971483 containerd[1481]: 2025-03-20 19:17:55.934 [INFO][3831] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibc2bf404381 ContainerID="387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-ljpn7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0"
Mar 20 19:17:55.971483 containerd[1481]: 2025-03-20 19:17:55.948 [INFO][3831] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-ljpn7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0"
Mar 20 19:17:55.971562 containerd[1481]: 2025-03-20 19:17:55.949 [INFO][3831] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-ljpn7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0", GenerateName:"calico-apiserver-594b5557b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"a6d2a3a2-f4d5-444a-902c-80afb149202f", ResourceVersion:"739", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 30, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"594b5557b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e", Pod:"calico-apiserver-594b5557b6-ljpn7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibc2bf404381", MAC:"ce:df:e2:8b:c2:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 19:17:55.971627 containerd[1481]: 2025-03-20 19:17:55.967 [INFO][3831] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-ljpn7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--ljpn7-eth0"
Mar 20 19:17:56.076760 containerd[1481]: time="2025-03-20T19:17:56.076704617Z" level=info msg="connecting to shim 387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e" address="unix:///run/containerd/s/2d7fa051cd85a663d81c61a9160909e905c517c089515b470d08c875db841255" namespace=k8s.io protocol=ttrpc version=3
Mar 20 19:17:56.172568 systemd[1]: Started cri-containerd-387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e.scope - libcontainer container 387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e.
Mar 20 19:17:56.266820 containerd[1481]: time="2025-03-20T19:17:56.266484303Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594b5557b6-ljpn7,Uid:a6d2a3a2-f4d5-444a-902c-80afb149202f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e\""
Mar 20 19:17:56.291469 kernel: bpftool[4022]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set
Mar 20 19:17:56.291590 containerd[1481]: time="2025-03-20T19:17:56.291525069Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 20 19:17:56.605829 systemd-networkd[1393]: vxlan.calico: Link UP
Mar 20 19:17:56.605839 systemd-networkd[1393]: vxlan.calico: Gained carrier
Mar 20 19:17:57.085536 systemd-networkd[1393]: calibc2bf404381: Gained IPv6LL
Mar 20 19:17:57.982546 systemd-networkd[1393]: vxlan.calico: Gained IPv6LL
Mar 20 19:17:59.726642 containerd[1481]: time="2025-03-20T19:17:59.726591828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-6txdn,Uid:6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40,Namespace:kube-system,Attempt:0,}"
Mar 20 19:17:59.731383 containerd[1481]: time="2025-03-20T19:17:59.730137128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594b5557b6-x9x55,Uid:5673035b-c7fa-4379-9bba-5327f7dcb354,Namespace:calico-apiserver,Attempt:0,}"
Mar 20 19:17:59.731651 containerd[1481]: time="2025-03-20T19:17:59.731619743Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-98754bff4-5m8cp,Uid:357e32cb-acd2-40bd-a1ff-f4c6a737b6d0,Namespace:calico-system,Attempt:0,}"
Mar 20 19:17:59.996065 systemd-networkd[1393]: cali2e89f9da3cd: Link UP
Mar 20 19:17:59.998071 systemd-networkd[1393]: cali2e89f9da3cd: Gained carrier
Mar 20 19:18:00.026412 containerd[1481]: 2025-03-20 19:17:59.847 [INFO][4101] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0 coredns-7db6d8ff4d- kube-system 6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40 731 0 2025-03-20 19:17:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-0-1-1-f6fba67404.novalocal coredns-7db6d8ff4d-6txdn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2e89f9da3cd [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6txdn" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-"
Mar 20 19:18:00.026412 containerd[1481]: 2025-03-20 19:17:59.847 [INFO][4101] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6txdn" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0"
Mar 20 19:18:00.026412 containerd[1481]: 2025-03-20 19:17:59.900 [INFO][4152] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" HandleID="k8s-pod-network.a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0"
Mar 20 19:18:00.026671 containerd[1481]: 2025-03-20 19:17:59.916 [INFO][4152] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" HandleID="k8s-pod-network.a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003107c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-0-1-1-f6fba67404.novalocal", "pod":"coredns-7db6d8ff4d-6txdn", "timestamp":"2025-03-20 19:17:59.900331918 +0000 UTC"}, Hostname:"ci-9999-0-1-1-f6fba67404.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 19:18:00.026671 containerd[1481]: 2025-03-20 19:17:59.916 [INFO][4152] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 19:18:00.026671 containerd[1481]: 2025-03-20 19:17:59.916 [INFO][4152] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 19:18:00.026671 containerd[1481]: 2025-03-20 19:17:59.917 [INFO][4152] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-1-1-f6fba67404.novalocal'
Mar 20 19:18:00.026671 containerd[1481]: 2025-03-20 19:17:59.927 [INFO][4152] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:18:00.026671 containerd[1481]: 2025-03-20 19:17:59.941 [INFO][4152] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:18:00.026671 containerd[1481]: 2025-03-20 19:17:59.950 [INFO][4152] ipam/ipam.go 489: Trying affinity for 192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:18:00.026671 containerd[1481]: 2025-03-20 19:17:59.956 [INFO][4152] ipam/ipam.go 155: Attempting to load block cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:18:00.026671 containerd[1481]: 2025-03-20 19:17:59.959 [INFO][4152] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:18:00.026906 containerd[1481]: 2025-03-20 19:17:59.959 [INFO][4152] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:18:00.026906 containerd[1481]: 2025-03-20 19:17:59.961 [INFO][4152] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95
Mar 20 19:18:00.026906 containerd[1481]: 2025-03-20 19:17:59.967 [INFO][4152] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:18:00.026906 containerd[1481]: 2025-03-20 19:17:59.976 [INFO][4152] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.96.130/26] block=192.168.96.128/26 handle="k8s-pod-network.a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:18:00.026906 containerd[1481]: 2025-03-20 19:17:59.976 [INFO][4152] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.130/26] handle="k8s-pod-network.a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" host="ci-9999-0-1-1-f6fba67404.novalocal"
Mar 20 19:18:00.026906 containerd[1481]: 2025-03-20 19:17:59.976 [INFO][4152] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 20 19:18:00.026906 containerd[1481]: 2025-03-20 19:17:59.976 [INFO][4152] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.96.130/26] IPv6=[] ContainerID="a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" HandleID="k8s-pod-network.a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0"
Mar 20 19:18:00.027070 containerd[1481]: 2025-03-20 19:17:59.984 [INFO][4101] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6txdn" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 22, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-6txdn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2e89f9da3cd", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 19:18:00.027070 containerd[1481]: 2025-03-20 19:17:59.985 [INFO][4101] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.96.130/32] ContainerID="a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6txdn" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0"
Mar 20 19:18:00.027070 containerd[1481]: 2025-03-20 19:17:59.985 [INFO][4101] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2e89f9da3cd ContainerID="a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6txdn" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0"
Mar 20 19:18:00.027070 containerd[1481]: 2025-03-20 19:17:59.998 [INFO][4101] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6txdn" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0"
Mar 20 19:18:00.027070 containerd[1481]: 2025-03-20 19:18:00.000 [INFO][4101] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6txdn" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 22, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95", Pod:"coredns-7db6d8ff4d-6txdn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2e89f9da3cd", MAC:"4a:a9:1f:79:8e:3a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 20 19:18:00.027070 containerd[1481]:
2025-03-20 19:18:00.022 [INFO][4101] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" Namespace="kube-system" Pod="coredns-7db6d8ff4d-6txdn" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--6txdn-eth0" Mar 20 19:18:00.067402 systemd-networkd[1393]: cali38bc39d8a11: Link UP Mar 20 19:18:00.067618 systemd-networkd[1393]: cali38bc39d8a11: Gained carrier Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:17:59.837 [INFO][4122] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0 calico-apiserver-594b5557b6- calico-apiserver 5673035b-c7fa-4379-9bba-5327f7dcb354 738 0 2025-03-20 19:17:30 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:594b5557b6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-0-1-1-f6fba67404.novalocal calico-apiserver-594b5557b6-x9x55 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali38bc39d8a11 [] []}} ContainerID="588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-x9x55" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-" Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:17:59.838 [INFO][4122] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-x9x55" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0" Mar 20 19:18:00.108806 containerd[1481]: 
2025-03-20 19:17:59.916 [INFO][4144] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" HandleID="k8s-pod-network.588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0" Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:17:59.939 [INFO][4144] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" HandleID="k8s-pod-network.588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000266940), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-0-1-1-f6fba67404.novalocal", "pod":"calico-apiserver-594b5557b6-x9x55", "timestamp":"2025-03-20 19:17:59.91628432 +0000 UTC"}, Hostname:"ci-9999-0-1-1-f6fba67404.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:17:59.940 [INFO][4144] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:17:59.978 [INFO][4144] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:17:59.978 [INFO][4144] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-1-1-f6fba67404.novalocal' Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:17:59.981 [INFO][4144] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:17:59.993 [INFO][4144] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:18:00.005 [INFO][4144] ipam/ipam.go 489: Trying affinity for 192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:18:00.010 [INFO][4144] ipam/ipam.go 155: Attempting to load block cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:18:00.020 [INFO][4144] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:18:00.020 [INFO][4144] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:18:00.027 [INFO][4144] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:18:00.035 [INFO][4144] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.108806 
containerd[1481]: 2025-03-20 19:18:00.050 [INFO][4144] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.96.131/26] block=192.168.96.128/26 handle="k8s-pod-network.588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:18:00.050 [INFO][4144] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.131/26] handle="k8s-pod-network.588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:18:00.052 [INFO][4144] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 19:18:00.108806 containerd[1481]: 2025-03-20 19:18:00.052 [INFO][4144] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.96.131/26] IPv6=[] ContainerID="588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" HandleID="k8s-pod-network.588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0" Mar 20 19:18:00.110482 containerd[1481]: 2025-03-20 19:18:00.059 [INFO][4122] cni-plugin/k8s.go 386: Populated endpoint ContainerID="588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-x9x55" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0", GenerateName:"calico-apiserver-594b5557b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"5673035b-c7fa-4379-9bba-5327f7dcb354", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 30, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"594b5557b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"", Pod:"calico-apiserver-594b5557b6-x9x55", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali38bc39d8a11", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 19:18:00.110482 containerd[1481]: 2025-03-20 19:18:00.061 [INFO][4122] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.96.131/32] ContainerID="588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-x9x55" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0" Mar 20 19:18:00.110482 containerd[1481]: 2025-03-20 19:18:00.061 [INFO][4122] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali38bc39d8a11 ContainerID="588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-x9x55" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0" Mar 20 19:18:00.110482 containerd[1481]: 2025-03-20 19:18:00.067 [INFO][4122] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-x9x55" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0" Mar 20 19:18:00.110482 containerd[1481]: 2025-03-20 19:18:00.068 [INFO][4122] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-x9x55" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0", GenerateName:"calico-apiserver-594b5557b6-", Namespace:"calico-apiserver", SelfLink:"", UID:"5673035b-c7fa-4379-9bba-5327f7dcb354", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"594b5557b6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f", Pod:"calico-apiserver-594b5557b6-x9x55", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.96.131/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali38bc39d8a11", MAC:"22:e4:86:fb:85:af", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 19:18:00.110482 containerd[1481]: 2025-03-20 19:18:00.095 [INFO][4122] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" Namespace="calico-apiserver" Pod="calico-apiserver-594b5557b6-x9x55" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--apiserver--594b5557b6--x9x55-eth0" Mar 20 19:18:00.116342 containerd[1481]: time="2025-03-20T19:18:00.114050457Z" level=info msg="connecting to shim a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95" address="unix:///run/containerd/s/39aaf811af369417447f95f6086a505a949faeb3540411114bc4d7596acd3fa1" namespace=k8s.io protocol=ttrpc version=3 Mar 20 19:18:00.152644 systemd[1]: Started cri-containerd-a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95.scope - libcontainer container a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95. 
Mar 20 19:18:00.171303 systemd-networkd[1393]: cali14f8a393e55: Link UP Mar 20 19:18:00.173474 systemd-networkd[1393]: cali14f8a393e55: Gained carrier Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:17:59.826 [INFO][4102] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0 calico-kube-controllers-98754bff4- calico-system 357e32cb-acd2-40bd-a1ff-f4c6a737b6d0 740 0 2025-03-20 19:17:30 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:98754bff4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-9999-0-1-1-f6fba67404.novalocal calico-kube-controllers-98754bff4-5m8cp eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali14f8a393e55 [] []}} ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Namespace="calico-system" Pod="calico-kube-controllers-98754bff4-5m8cp" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-" Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:17:59.827 [INFO][4102] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Namespace="calico-system" Pod="calico-kube-controllers-98754bff4-5m8cp" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:17:59.947 [INFO][4142] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" 
Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:17:59.963 [INFO][4142] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004224c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-1-1-f6fba67404.novalocal", "pod":"calico-kube-controllers-98754bff4-5m8cp", "timestamp":"2025-03-20 19:17:59.947851615 +0000 UTC"}, Hostname:"ci-9999-0-1-1-f6fba67404.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:17:59.963 [INFO][4142] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.051 [INFO][4142] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.051 [INFO][4142] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-1-1-f6fba67404.novalocal' Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.053 [INFO][4142] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.071 [INFO][4142] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.105 [INFO][4142] ipam/ipam.go 489: Trying affinity for 192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.111 [INFO][4142] ipam/ipam.go 155: Attempting to load block cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.118 [INFO][4142] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.118 [INFO][4142] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.121 [INFO][4142] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00 Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.137 [INFO][4142] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.206548 
containerd[1481]: 2025-03-20 19:18:00.154 [INFO][4142] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.96.132/26] block=192.168.96.128/26 handle="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.154 [INFO][4142] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.132/26] handle="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.154 [INFO][4142] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 19:18:00.206548 containerd[1481]: 2025-03-20 19:18:00.154 [INFO][4142] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.96.132/26] IPv6=[] ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:18:00.207185 containerd[1481]: 2025-03-20 19:18:00.160 [INFO][4102] cni-plugin/k8s.go 386: Populated endpoint ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Namespace="calico-system" Pod="calico-kube-controllers-98754bff4-5m8cp" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0", GenerateName:"calico-kube-controllers-98754bff4-", Namespace:"calico-system", SelfLink:"", UID:"357e32cb-acd2-40bd-a1ff-f4c6a737b6d0", ResourceVersion:"740", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 30, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"98754bff4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"", Pod:"calico-kube-controllers-98754bff4-5m8cp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali14f8a393e55", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 19:18:00.207185 containerd[1481]: 2025-03-20 19:18:00.163 [INFO][4102] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.96.132/32] ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Namespace="calico-system" Pod="calico-kube-controllers-98754bff4-5m8cp" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:18:00.207185 containerd[1481]: 2025-03-20 19:18:00.163 [INFO][4102] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali14f8a393e55 ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Namespace="calico-system" Pod="calico-kube-controllers-98754bff4-5m8cp" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:18:00.207185 containerd[1481]: 2025-03-20 19:18:00.174 [INFO][4102] cni-plugin/dataplane_linux.go 
508: Disabling IPv4 forwarding ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Namespace="calico-system" Pod="calico-kube-controllers-98754bff4-5m8cp" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:18:00.207185 containerd[1481]: 2025-03-20 19:18:00.176 [INFO][4102] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Namespace="calico-system" Pod="calico-kube-controllers-98754bff4-5m8cp" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0", GenerateName:"calico-kube-controllers-98754bff4-", Namespace:"calico-system", SelfLink:"", UID:"357e32cb-acd2-40bd-a1ff-f4c6a737b6d0", ResourceVersion:"740", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"98754bff4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00", Pod:"calico-kube-controllers-98754bff4-5m8cp", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali14f8a393e55", MAC:"4a:0d:82:46:84:52", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 19:18:00.207185 containerd[1481]: 2025-03-20 19:18:00.196 [INFO][4102] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Namespace="calico-system" Pod="calico-kube-controllers-98754bff4-5m8cp" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:18:00.224545 containerd[1481]: time="2025-03-20T19:18:00.224426455Z" level=info msg="connecting to shim 588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f" address="unix:///run/containerd/s/7769141e0feb3df2002a1bed9983155bb9b7a6e951bbd8f54e0809715d2e6da9" namespace=k8s.io protocol=ttrpc version=3 Mar 20 19:18:00.266759 containerd[1481]: time="2025-03-20T19:18:00.266004798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-6txdn,Uid:6257ddcf-2a54-49d8-9f21-1a5fdc8c8b40,Namespace:kube-system,Attempt:0,} returns sandbox id \"a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95\"" Mar 20 19:18:00.275036 containerd[1481]: time="2025-03-20T19:18:00.274985032Z" level=info msg="connecting to shim afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" address="unix:///run/containerd/s/a22a5c843f7e59cd3d0c50443b133738f194d5e266d3f3c50cf3d8065d78d782" namespace=k8s.io protocol=ttrpc version=3 Mar 20 19:18:00.282280 containerd[1481]: time="2025-03-20T19:18:00.281469440Z" level=info msg="CreateContainer within sandbox \"a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 20 
19:18:00.284908 systemd[1]: Started cri-containerd-588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f.scope - libcontainer container 588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f. Mar 20 19:18:00.320817 containerd[1481]: time="2025-03-20T19:18:00.320682670Z" level=info msg="Container fd44634977538639aa5667e7aa7515bb31dc8bdd62d1cd495dd6e84148879835: CDI devices from CRI Config.CDIDevices: []" Mar 20 19:18:00.351367 containerd[1481]: time="2025-03-20T19:18:00.351301335Z" level=info msg="CreateContainer within sandbox \"a73d98fe9f51afc1b6c9ab325aaccd740108d0d880e6ef03de7ea93f960faf95\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fd44634977538639aa5667e7aa7515bb31dc8bdd62d1cd495dd6e84148879835\"" Mar 20 19:18:00.352946 containerd[1481]: time="2025-03-20T19:18:00.352916829Z" level=info msg="StartContainer for \"fd44634977538639aa5667e7aa7515bb31dc8bdd62d1cd495dd6e84148879835\"" Mar 20 19:18:00.353797 containerd[1481]: time="2025-03-20T19:18:00.353764110Z" level=info msg="connecting to shim fd44634977538639aa5667e7aa7515bb31dc8bdd62d1cd495dd6e84148879835" address="unix:///run/containerd/s/39aaf811af369417447f95f6086a505a949faeb3540411114bc4d7596acd3fa1" protocol=ttrpc version=3 Mar 20 19:18:00.354261 systemd[1]: Started cri-containerd-afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00.scope - libcontainer container afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00. Mar 20 19:18:00.401170 systemd[1]: Started cri-containerd-fd44634977538639aa5667e7aa7515bb31dc8bdd62d1cd495dd6e84148879835.scope - libcontainer container fd44634977538639aa5667e7aa7515bb31dc8bdd62d1cd495dd6e84148879835. 
Mar 20 19:18:00.491430 containerd[1481]: time="2025-03-20T19:18:00.491375821Z" level=info msg="StartContainer for \"fd44634977538639aa5667e7aa7515bb31dc8bdd62d1cd495dd6e84148879835\" returns successfully" Mar 20 19:18:00.509489 containerd[1481]: time="2025-03-20T19:18:00.509456825Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-594b5557b6-x9x55,Uid:5673035b-c7fa-4379-9bba-5327f7dcb354,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f\"" Mar 20 19:18:00.565445 containerd[1481]: time="2025-03-20T19:18:00.564855493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-98754bff4-5m8cp,Uid:357e32cb-acd2-40bd-a1ff-f4c6a737b6d0,Namespace:calico-system,Attempt:0,} returns sandbox id \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\"" Mar 20 19:18:00.711870 containerd[1481]: time="2025-03-20T19:18:00.711300314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-lkvbx,Uid:5b1b32a2-c65c-4c93-a4b3-5346ab5325b0,Namespace:kube-system,Attempt:0,}" Mar 20 19:18:00.712840 containerd[1481]: time="2025-03-20T19:18:00.712808578Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h7zvw,Uid:90ee6a2c-61e2-4dc7-b67c-cee29f534863,Namespace:calico-system,Attempt:0,}" Mar 20 19:18:01.014377 systemd-networkd[1393]: cali1f654755e07: Link UP Mar 20 19:18:01.014578 systemd-networkd[1393]: cali1f654755e07: Gained carrier Mar 20 19:18:01.021559 kubelet[2812]: I0320 19:18:01.020994 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-6txdn" podStartSLOduration=39.020974737 podStartE2EDuration="39.020974737s" podCreationTimestamp="2025-03-20 19:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 19:18:01.020851427 +0000 UTC m=+53.415673639" 
watchObservedRunningTime="2025-03-20 19:18:01.020974737 +0000 UTC m=+53.415796940" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.839 [INFO][4368] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0 coredns-7db6d8ff4d- kube-system 5b1b32a2-c65c-4c93-a4b3-5346ab5325b0 736 0 2025-03-20 19:17:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-0-1-1-f6fba67404.novalocal coredns-7db6d8ff4d-lkvbx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1f654755e07 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" Namespace="kube-system" Pod="coredns-7db6d8ff4d-lkvbx" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.840 [INFO][4368] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" Namespace="kube-system" Pod="coredns-7db6d8ff4d-lkvbx" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.908 [INFO][4391] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" HandleID="k8s-pod-network.5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.935 [INFO][4391] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" HandleID="k8s-pod-network.5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ba670), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-0-1-1-f6fba67404.novalocal", "pod":"coredns-7db6d8ff4d-lkvbx", "timestamp":"2025-03-20 19:18:00.908136651 +0000 UTC"}, Hostname:"ci-9999-0-1-1-f6fba67404.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.935 [INFO][4391] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.935 [INFO][4391] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.935 [INFO][4391] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-1-1-f6fba67404.novalocal' Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.940 [INFO][4391] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.953 [INFO][4391] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.964 [INFO][4391] ipam/ipam.go 489: Trying affinity for 192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.966 [INFO][4391] ipam/ipam.go 155: Attempting to load block cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.971 [INFO][4391] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.972 [INFO][4391] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.974 [INFO][4391] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676 Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.982 [INFO][4391] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.062374 
containerd[1481]: 2025-03-20 19:18:00.998 [INFO][4391] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.96.133/26] block=192.168.96.128/26 handle="k8s-pod-network.5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.998 [INFO][4391] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.133/26] handle="k8s-pod-network.5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.999 [INFO][4391] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 19:18:01.062374 containerd[1481]: 2025-03-20 19:18:00.999 [INFO][4391] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.96.133/26] IPv6=[] ContainerID="5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" HandleID="k8s-pod-network.5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0" Mar 20 19:18:01.064370 containerd[1481]: 2025-03-20 19:18:01.005 [INFO][4368] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" Namespace="kube-system" Pod="coredns-7db6d8ff4d-lkvbx" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"5b1b32a2-c65c-4c93-a4b3-5346ab5325b0", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-lkvbx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.96.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1f654755e07", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 19:18:01.064370 containerd[1481]: 2025-03-20 19:18:01.005 [INFO][4368] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.96.133/32] ContainerID="5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" Namespace="kube-system" Pod="coredns-7db6d8ff4d-lkvbx" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0" Mar 20 19:18:01.064370 containerd[1481]: 2025-03-20 19:18:01.006 [INFO][4368] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1f654755e07 ContainerID="5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" Namespace="kube-system" Pod="coredns-7db6d8ff4d-lkvbx" 
WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0" Mar 20 19:18:01.064370 containerd[1481]: 2025-03-20 19:18:01.012 [INFO][4368] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" Namespace="kube-system" Pod="coredns-7db6d8ff4d-lkvbx" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0" Mar 20 19:18:01.064370 containerd[1481]: 2025-03-20 19:18:01.017 [INFO][4368] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" Namespace="kube-system" Pod="coredns-7db6d8ff4d-lkvbx" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"5b1b32a2-c65c-4c93-a4b3-5346ab5325b0", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676", Pod:"coredns-7db6d8ff4d-lkvbx", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.96.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1f654755e07", MAC:"86:c2:d8:b5:06:12", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 19:18:01.064370 containerd[1481]: 2025-03-20 19:18:01.054 [INFO][4368] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" Namespace="kube-system" Pod="coredns-7db6d8ff4d-lkvbx" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-coredns--7db6d8ff4d--lkvbx-eth0" Mar 20 19:18:01.130692 systemd-networkd[1393]: cali8da9e28204a: Link UP Mar 20 19:18:01.133886 systemd-networkd[1393]: cali8da9e28204a: Gained carrier Mar 20 19:18:01.142726 containerd[1481]: time="2025-03-20T19:18:01.142670973Z" level=info msg="connecting to shim 5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676" address="unix:///run/containerd/s/24984377d6bc14dcab3bb2c833bf56059e389d5f1196e9f29b5a1f294a4fba55" namespace=k8s.io protocol=ttrpc version=3 Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:00.863 [INFO][4374] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0 csi-node-driver- calico-system 90ee6a2c-61e2-4dc7-b67c-cee29f534863 606 0 2025-03-20 19:17:29 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d 
k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-9999-0-1-1-f6fba67404.novalocal csi-node-driver-h7zvw eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8da9e28204a [] []}} ContainerID="090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" Namespace="calico-system" Pod="csi-node-driver-h7zvw" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:00.863 [INFO][4374] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" Namespace="calico-system" Pod="csi-node-driver-h7zvw" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:00.957 [INFO][4396] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" HandleID="k8s-pod-network.090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:00.980 [INFO][4396] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" HandleID="k8s-pod-network.090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ed560), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-1-1-f6fba67404.novalocal", "pod":"csi-node-driver-h7zvw", "timestamp":"2025-03-20 19:18:00.95713687 
+0000 UTC"}, Hostname:"ci-9999-0-1-1-f6fba67404.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:00.981 [INFO][4396] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.001 [INFO][4396] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.001 [INFO][4396] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-1-1-f6fba67404.novalocal' Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.005 [INFO][4396] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.036 [INFO][4396] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.058 [INFO][4396] ipam/ipam.go 489: Trying affinity for 192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.065 [INFO][4396] ipam/ipam.go 155: Attempting to load block cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.079 [INFO][4396] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.080 [INFO][4396] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" 
host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.083 [INFO][4396] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.092 [INFO][4396] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.103 [INFO][4396] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.96.134/26] block=192.168.96.128/26 handle="k8s-pod-network.090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.103 [INFO][4396] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.134/26] handle="k8s-pod-network.090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.103 [INFO][4396] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 20 19:18:01.166671 containerd[1481]: 2025-03-20 19:18:01.103 [INFO][4396] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.96.134/26] IPv6=[] ContainerID="090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" HandleID="k8s-pod-network.090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0" Mar 20 19:18:01.167256 containerd[1481]: 2025-03-20 19:18:01.119 [INFO][4374] cni-plugin/k8s.go 386: Populated endpoint ContainerID="090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" Namespace="calico-system" Pod="csi-node-driver-h7zvw" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"90ee6a2c-61e2-4dc7-b67c-cee29f534863", ResourceVersion:"606", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"", Pod:"csi-node-driver-h7zvw", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", 
IPNetworks:[]string{"192.168.96.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8da9e28204a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 19:18:01.167256 containerd[1481]: 2025-03-20 19:18:01.121 [INFO][4374] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.96.134/32] ContainerID="090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" Namespace="calico-system" Pod="csi-node-driver-h7zvw" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0" Mar 20 19:18:01.167256 containerd[1481]: 2025-03-20 19:18:01.121 [INFO][4374] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8da9e28204a ContainerID="090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" Namespace="calico-system" Pod="csi-node-driver-h7zvw" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0" Mar 20 19:18:01.167256 containerd[1481]: 2025-03-20 19:18:01.136 [INFO][4374] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" Namespace="calico-system" Pod="csi-node-driver-h7zvw" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0" Mar 20 19:18:01.167256 containerd[1481]: 2025-03-20 19:18:01.138 [INFO][4374] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" Namespace="calico-system" Pod="csi-node-driver-h7zvw" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"90ee6a2c-61e2-4dc7-b67c-cee29f534863", ResourceVersion:"606", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 17, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc", Pod:"csi-node-driver-h7zvw", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.96.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8da9e28204a", MAC:"2e:cf:e2:78:84:75", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 19:18:01.167256 containerd[1481]: 2025-03-20 19:18:01.158 [INFO][4374] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" Namespace="calico-system" Pod="csi-node-driver-h7zvw" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-csi--node--driver--h7zvw-eth0" Mar 20 19:18:01.228561 systemd[1]: Started cri-containerd-5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676.scope - libcontainer container 
5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676. Mar 20 19:18:01.238161 containerd[1481]: time="2025-03-20T19:18:01.238082067Z" level=info msg="connecting to shim 090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc" address="unix:///run/containerd/s/d03ecade11209f2ae83ae46e07ffb417bdad62348aa8c7b11456f9f75ae0d936" namespace=k8s.io protocol=ttrpc version=3 Mar 20 19:18:01.287602 systemd[1]: Started cri-containerd-090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc.scope - libcontainer container 090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc. Mar 20 19:18:01.331536 containerd[1481]: time="2025-03-20T19:18:01.331504727Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-lkvbx,Uid:5b1b32a2-c65c-4c93-a4b3-5346ab5325b0,Namespace:kube-system,Attempt:0,} returns sandbox id \"5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676\"" Mar 20 19:18:01.335433 containerd[1481]: time="2025-03-20T19:18:01.335390833Z" level=info msg="CreateContainer within sandbox \"5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 20 19:18:01.351682 containerd[1481]: time="2025-03-20T19:18:01.351513833Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-h7zvw,Uid:90ee6a2c-61e2-4dc7-b67c-cee29f534863,Namespace:calico-system,Attempt:0,} returns sandbox id \"090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc\"" Mar 20 19:18:01.360122 containerd[1481]: time="2025-03-20T19:18:01.360071429Z" level=info msg="Container c658fdac49528f949a3f752b7f1f11ef89b5b0e4585d567382a5b68d181b5ee9: CDI devices from CRI Config.CDIDevices: []" Mar 20 19:18:01.385944 containerd[1481]: time="2025-03-20T19:18:01.385880512Z" level=info msg="CreateContainer within sandbox \"5666088a3913bc6b33c5748b7b635fc0b73a215db833d23a4bfbdd55bec72676\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"c658fdac49528f949a3f752b7f1f11ef89b5b0e4585d567382a5b68d181b5ee9\"" Mar 20 19:18:01.387487 containerd[1481]: time="2025-03-20T19:18:01.386469933Z" level=info msg="StartContainer for \"c658fdac49528f949a3f752b7f1f11ef89b5b0e4585d567382a5b68d181b5ee9\"" Mar 20 19:18:01.387607 containerd[1481]: time="2025-03-20T19:18:01.387541914Z" level=info msg="connecting to shim c658fdac49528f949a3f752b7f1f11ef89b5b0e4585d567382a5b68d181b5ee9" address="unix:///run/containerd/s/24984377d6bc14dcab3bb2c833bf56059e389d5f1196e9f29b5a1f294a4fba55" protocol=ttrpc version=3 Mar 20 19:18:01.411530 systemd[1]: Started cri-containerd-c658fdac49528f949a3f752b7f1f11ef89b5b0e4585d567382a5b68d181b5ee9.scope - libcontainer container c658fdac49528f949a3f752b7f1f11ef89b5b0e4585d567382a5b68d181b5ee9. Mar 20 19:18:01.412770 containerd[1481]: time="2025-03-20T19:18:01.412428165Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:18:01.413620 containerd[1481]: time="2025-03-20T19:18:01.413563784Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204" Mar 20 19:18:01.426961 containerd[1481]: time="2025-03-20T19:18:01.426922314Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:18:01.430781 containerd[1481]: time="2025-03-20T19:18:01.430635295Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:18:01.431558 containerd[1481]: time="2025-03-20T19:18:01.431509337Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag 
\"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 5.13994851s" Mar 20 19:18:01.431558 containerd[1481]: time="2025-03-20T19:18:01.431549312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 20 19:18:01.433766 containerd[1481]: time="2025-03-20T19:18:01.433742676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 20 19:18:01.437021 containerd[1481]: time="2025-03-20T19:18:01.436411288Z" level=info msg="CreateContainer within sandbox \"387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 20 19:18:01.453048 containerd[1481]: time="2025-03-20T19:18:01.452937321Z" level=info msg="Container 041a17bcd901cc37f60c6033672b4e4fb20204af8df065837dd5a2f61edf81b7: CDI devices from CRI Config.CDIDevices: []" Mar 20 19:18:01.457312 containerd[1481]: time="2025-03-20T19:18:01.457162179Z" level=info msg="StartContainer for \"c658fdac49528f949a3f752b7f1f11ef89b5b0e4585d567382a5b68d181b5ee9\" returns successfully" Mar 20 19:18:01.469862 containerd[1481]: time="2025-03-20T19:18:01.469770397Z" level=info msg="CreateContainer within sandbox \"387e5263c1edf94f5af684f2af8a62488407993c481ffe92e34c4d1605c0a76e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"041a17bcd901cc37f60c6033672b4e4fb20204af8df065837dd5a2f61edf81b7\"" Mar 20 19:18:01.471259 containerd[1481]: time="2025-03-20T19:18:01.470640281Z" level=info msg="StartContainer for \"041a17bcd901cc37f60c6033672b4e4fb20204af8df065837dd5a2f61edf81b7\"" Mar 20 19:18:01.473171 containerd[1481]: time="2025-03-20T19:18:01.473112907Z" level=info msg="connecting to shim 041a17bcd901cc37f60c6033672b4e4fb20204af8df065837dd5a2f61edf81b7" 
address="unix:///run/containerd/s/2d7fa051cd85a663d81c61a9160909e905c517c089515b470d08c875db841255" protocol=ttrpc version=3 Mar 20 19:18:01.496502 systemd[1]: Started cri-containerd-041a17bcd901cc37f60c6033672b4e4fb20204af8df065837dd5a2f61edf81b7.scope - libcontainer container 041a17bcd901cc37f60c6033672b4e4fb20204af8df065837dd5a2f61edf81b7. Mar 20 19:18:01.502980 systemd-networkd[1393]: cali14f8a393e55: Gained IPv6LL Mar 20 19:18:01.667610 containerd[1481]: time="2025-03-20T19:18:01.666731675Z" level=info msg="StartContainer for \"041a17bcd901cc37f60c6033672b4e4fb20204af8df065837dd5a2f61edf81b7\" returns successfully" Mar 20 19:18:01.886520 systemd-networkd[1393]: cali2e89f9da3cd: Gained IPv6LL Mar 20 19:18:01.935993 containerd[1481]: time="2025-03-20T19:18:01.935824046Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 20 19:18:01.938415 containerd[1481]: time="2025-03-20T19:18:01.938279740Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 20 19:18:01.947986 containerd[1481]: time="2025-03-20T19:18:01.947913867Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 514.096339ms" Mar 20 19:18:01.948105 containerd[1481]: time="2025-03-20T19:18:01.947986662Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 20 19:18:01.950443 systemd-networkd[1393]: cali38bc39d8a11: Gained IPv6LL Mar 20 19:18:01.955919 containerd[1481]: time="2025-03-20T19:18:01.955862095Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 20 19:18:01.957202 containerd[1481]: time="2025-03-20T19:18:01.957152363Z" level=info msg="CreateContainer within sandbox \"588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 20 19:18:01.980867 containerd[1481]: time="2025-03-20T19:18:01.980799791Z" level=info msg="Container 4326105f3ae70fb345bcbb882b2904b91bcfb7973f3bb9a0ab1fb14d031aee54: CDI devices from CRI Config.CDIDevices: []" Mar 20 19:18:02.004103 containerd[1481]: time="2025-03-20T19:18:02.004032479Z" level=info msg="CreateContainer within sandbox \"588e6f3e076027c23ccf15d85f159ece103269abafa08471cd915f11ab5a348f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4326105f3ae70fb345bcbb882b2904b91bcfb7973f3bb9a0ab1fb14d031aee54\"" Mar 20 19:18:02.005161 containerd[1481]: time="2025-03-20T19:18:02.005117326Z" level=info msg="StartContainer for \"4326105f3ae70fb345bcbb882b2904b91bcfb7973f3bb9a0ab1fb14d031aee54\"" Mar 20 19:18:02.013009 containerd[1481]: time="2025-03-20T19:18:02.012971998Z" level=info msg="connecting to shim 4326105f3ae70fb345bcbb882b2904b91bcfb7973f3bb9a0ab1fb14d031aee54" address="unix:///run/containerd/s/7769141e0feb3df2002a1bed9983155bb9b7a6e951bbd8f54e0809715d2e6da9" protocol=ttrpc version=3 Mar 20 19:18:02.072698 systemd[1]: Started cri-containerd-4326105f3ae70fb345bcbb882b2904b91bcfb7973f3bb9a0ab1fb14d031aee54.scope - libcontainer container 4326105f3ae70fb345bcbb882b2904b91bcfb7973f3bb9a0ab1fb14d031aee54. 
Mar 20 19:18:02.081787 kubelet[2812]: I0320 19:18:02.081262 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-594b5557b6-ljpn7" podStartSLOduration=26.939172114 podStartE2EDuration="32.081241822s" podCreationTimestamp="2025-03-20 19:17:30 +0000 UTC" firstStartedPulling="2025-03-20 19:17:56.291139302 +0000 UTC m=+48.685961514" lastFinishedPulling="2025-03-20 19:18:01.43320902 +0000 UTC m=+53.828031222" observedRunningTime="2025-03-20 19:18:02.061593688 +0000 UTC m=+54.456415890" watchObservedRunningTime="2025-03-20 19:18:02.081241822 +0000 UTC m=+54.476064024"
Mar 20 19:18:02.182005 containerd[1481]: time="2025-03-20T19:18:02.181941787Z" level=info msg="StartContainer for \"4326105f3ae70fb345bcbb882b2904b91bcfb7973f3bb9a0ab1fb14d031aee54\" returns successfully"
Mar 20 19:18:02.461508 systemd-networkd[1393]: cali1f654755e07: Gained IPv6LL
Mar 20 19:18:02.654494 systemd-networkd[1393]: cali8da9e28204a: Gained IPv6LL
Mar 20 19:18:03.060333 kubelet[2812]: I0320 19:18:03.059222 2812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 19:18:03.089129 kubelet[2812]: I0320 19:18:03.089051 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-lkvbx" podStartSLOduration=41.089033938 podStartE2EDuration="41.089033938s" podCreationTimestamp="2025-03-20 19:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 19:18:02.082692913 +0000 UTC m=+54.477515125" watchObservedRunningTime="2025-03-20 19:18:03.089033938 +0000 UTC m=+55.483856150"
Mar 20 19:18:03.108467 kubelet[2812]: I0320 19:18:03.107580 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-594b5557b6-x9x55" podStartSLOduration=31.667839953 podStartE2EDuration="33.10756166s" podCreationTimestamp="2025-03-20 19:17:30 +0000 UTC" firstStartedPulling="2025-03-20 19:18:00.513536136 +0000 UTC m=+52.908358348" lastFinishedPulling="2025-03-20 19:18:01.953257843 +0000 UTC m=+54.348080055" observedRunningTime="2025-03-20 19:18:03.088237399 +0000 UTC m=+55.483059601" watchObservedRunningTime="2025-03-20 19:18:03.10756166 +0000 UTC m=+55.502383862"
Mar 20 19:18:06.173665 kubelet[2812]: I0320 19:18:06.173520 2812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 19:18:06.339279 containerd[1481]: time="2025-03-20T19:18:06.339182731Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:18:06.340769 containerd[1481]: time="2025-03-20T19:18:06.340606808Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912"
Mar 20 19:18:06.342143 containerd[1481]: time="2025-03-20T19:18:06.342064327Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:18:06.344691 containerd[1481]: time="2025-03-20T19:18:06.344605314Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:18:06.345390 containerd[1481]: time="2025-03-20T19:18:06.345234743Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 4.389303248s"
Mar 20 19:18:06.345390 containerd[1481]: time="2025-03-20T19:18:06.345265671Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\""
Mar 20 19:18:06.346877 containerd[1481]: time="2025-03-20T19:18:06.346736545Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\""
Mar 20 19:18:06.362365 containerd[1481]: time="2025-03-20T19:18:06.360793746Z" level=info msg="CreateContainer within sandbox \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 20 19:18:06.374603 containerd[1481]: time="2025-03-20T19:18:06.374569831Z" level=info msg="Container 8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:18:06.379624 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2813984616.mount: Deactivated successfully.
Mar 20 19:18:06.410399 containerd[1481]: time="2025-03-20T19:18:06.410329658Z" level=info msg="CreateContainer within sandbox \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\""
Mar 20 19:18:06.411200 containerd[1481]: time="2025-03-20T19:18:06.411143671Z" level=info msg="StartContainer for \"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\""
Mar 20 19:18:06.412836 containerd[1481]: time="2025-03-20T19:18:06.412599157Z" level=info msg="connecting to shim 8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c" address="unix:///run/containerd/s/a22a5c843f7e59cd3d0c50443b133738f194d5e266d3f3c50cf3d8065d78d782" protocol=ttrpc version=3
Mar 20 19:18:06.438523 systemd[1]: Started cri-containerd-8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c.scope - libcontainer container 8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c.
Mar 20 19:18:06.496227 containerd[1481]: time="2025-03-20T19:18:06.496190634Z" level=info msg="StartContainer for \"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" returns successfully"
Mar 20 19:18:07.108783 kubelet[2812]: I0320 19:18:07.108667 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-98754bff4-5m8cp" podStartSLOduration=31.330713437 podStartE2EDuration="37.108624481s" podCreationTimestamp="2025-03-20 19:17:30 +0000 UTC" firstStartedPulling="2025-03-20 19:18:00.568620096 +0000 UTC m=+52.963442308" lastFinishedPulling="2025-03-20 19:18:06.34653115 +0000 UTC m=+58.741353352" observedRunningTime="2025-03-20 19:18:07.105764303 +0000 UTC m=+59.500586605" watchObservedRunningTime="2025-03-20 19:18:07.108624481 +0000 UTC m=+59.503446743"
Mar 20 19:18:08.203602 containerd[1481]: time="2025-03-20T19:18:08.203568869Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" id:\"bfd83fca802aa83cadb99528ac923356335ef539cd0e27cc013757a52d8adf56\" pid:4711 exited_at:{seconds:1742498288 nanos:202913311}"
Mar 20 19:18:08.524297 update_engine[1469]: I20250320 19:18:08.523439 1469 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Mar 20 19:18:08.524297 update_engine[1469]: I20250320 19:18:08.523545 1469 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Mar 20 19:18:08.524297 update_engine[1469]: I20250320 19:18:08.523758 1469 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Mar 20 19:18:08.531344 update_engine[1469]: I20250320 19:18:08.530397 1469 omaha_request_params.cc:62] Current group set to developer
Mar 20 19:18:08.531344 update_engine[1469]: I20250320 19:18:08.531061 1469 update_attempter.cc:499] Already updated boot flags. Skipping.
Mar 20 19:18:08.531344 update_engine[1469]: I20250320 19:18:08.531075 1469 update_attempter.cc:643] Scheduling an action processor start.
Mar 20 19:18:08.531344 update_engine[1469]: I20250320 19:18:08.531090 1469 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 20 19:18:08.531344 update_engine[1469]: I20250320 19:18:08.531119 1469 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Mar 20 19:18:08.531344 update_engine[1469]: I20250320 19:18:08.531204 1469 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 20 19:18:08.531344 update_engine[1469]: I20250320 19:18:08.531226 1469 omaha_request_action.cc:272] Request:
Mar 20 19:18:08.531344 update_engine[1469]:
Mar 20 19:18:08.531344 update_engine[1469]:
Mar 20 19:18:08.531344 update_engine[1469]:
Mar 20 19:18:08.531344 update_engine[1469]:
Mar 20 19:18:08.531344 update_engine[1469]:
Mar 20 19:18:08.531344 update_engine[1469]:
Mar 20 19:18:08.531344 update_engine[1469]:
Mar 20 19:18:08.531344 update_engine[1469]:
Mar 20 19:18:08.531344 update_engine[1469]: I20250320 19:18:08.531234 1469 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 20 19:18:08.536682 locksmithd[1488]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Mar 20 19:18:08.548962 update_engine[1469]: I20250320 19:18:08.548894 1469 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 20 19:18:08.549309 update_engine[1469]: I20250320 19:18:08.549260 1469 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 20 19:18:08.555524 update_engine[1469]: E20250320 19:18:08.555225 1469 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 20 19:18:08.555524 update_engine[1469]: I20250320 19:18:08.555459 1469 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Mar 20 19:18:08.910900 containerd[1481]: time="2025-03-20T19:18:08.910838832Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:18:08.912301 containerd[1481]: time="2025-03-20T19:18:08.912203350Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887"
Mar 20 19:18:08.914386 containerd[1481]: time="2025-03-20T19:18:08.914297265Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:18:08.919404 containerd[1481]: time="2025-03-20T19:18:08.919296967Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:18:08.921458 containerd[1481]: time="2025-03-20T19:18:08.921390239Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 2.574585015s"
Mar 20 19:18:08.921699 containerd[1481]: time="2025-03-20T19:18:08.921430194Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\""
Mar 20 19:18:08.929172 containerd[1481]: time="2025-03-20T19:18:08.928342381Z" level=info msg="CreateContainer within sandbox \"090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 20 19:18:08.971144 containerd[1481]: time="2025-03-20T19:18:08.971086330Z" level=info msg="Container 36e560c9a7aa1b63b0ebfabd5fb25603dc795bb52f19389c8048d527faa61f28: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:18:08.991456 containerd[1481]: time="2025-03-20T19:18:08.991412027Z" level=info msg="CreateContainer within sandbox \"090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"36e560c9a7aa1b63b0ebfabd5fb25603dc795bb52f19389c8048d527faa61f28\""
Mar 20 19:18:08.992827 containerd[1481]: time="2025-03-20T19:18:08.992796171Z" level=info msg="StartContainer for \"36e560c9a7aa1b63b0ebfabd5fb25603dc795bb52f19389c8048d527faa61f28\""
Mar 20 19:18:08.996600 containerd[1481]: time="2025-03-20T19:18:08.996559005Z" level=info msg="connecting to shim 36e560c9a7aa1b63b0ebfabd5fb25603dc795bb52f19389c8048d527faa61f28" address="unix:///run/containerd/s/d03ecade11209f2ae83ae46e07ffb417bdad62348aa8c7b11456f9f75ae0d936" protocol=ttrpc version=3
Mar 20 19:18:09.031884 systemd[1]: Started cri-containerd-36e560c9a7aa1b63b0ebfabd5fb25603dc795bb52f19389c8048d527faa61f28.scope - libcontainer container 36e560c9a7aa1b63b0ebfabd5fb25603dc795bb52f19389c8048d527faa61f28.
Mar 20 19:18:09.114969 containerd[1481]: time="2025-03-20T19:18:09.114019316Z" level=info msg="StartContainer for \"36e560c9a7aa1b63b0ebfabd5fb25603dc795bb52f19389c8048d527faa61f28\" returns successfully"
Mar 20 19:18:09.117550 containerd[1481]: time="2025-03-20T19:18:09.117392914Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\""
Mar 20 19:18:11.499274 containerd[1481]: time="2025-03-20T19:18:11.499158087Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:18:11.500864 containerd[1481]: time="2025-03-20T19:18:11.500658002Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843"
Mar 20 19:18:11.503169 containerd[1481]: time="2025-03-20T19:18:11.502772772Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:18:11.505613 containerd[1481]: time="2025-03-20T19:18:11.505583047Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 20 19:18:11.506096 containerd[1481]: time="2025-03-20T19:18:11.506065453Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.388622335s"
Mar 20 19:18:11.506145 containerd[1481]: time="2025-03-20T19:18:11.506097102Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\""
Mar 20 19:18:11.509616 containerd[1481]: time="2025-03-20T19:18:11.509552038Z" level=info msg="CreateContainer within sandbox \"090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 20 19:18:11.522450 containerd[1481]: time="2025-03-20T19:18:11.520732745Z" level=info msg="Container 3988a24d42c845fddef314b686458fe05b4b9001e11cc455dcc1a79ca2f1e0c1: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:18:11.536544 containerd[1481]: time="2025-03-20T19:18:11.536501995Z" level=info msg="CreateContainer within sandbox \"090c9c6e91807576813f375977d7120c0a46132f65a19f98082abcb1fd4ba4fc\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3988a24d42c845fddef314b686458fe05b4b9001e11cc455dcc1a79ca2f1e0c1\""
Mar 20 19:18:11.536933 containerd[1481]: time="2025-03-20T19:18:11.536902638Z" level=info msg="StartContainer for \"3988a24d42c845fddef314b686458fe05b4b9001e11cc455dcc1a79ca2f1e0c1\""
Mar 20 19:18:11.539429 containerd[1481]: time="2025-03-20T19:18:11.539277676Z" level=info msg="connecting to shim 3988a24d42c845fddef314b686458fe05b4b9001e11cc455dcc1a79ca2f1e0c1" address="unix:///run/containerd/s/d03ecade11209f2ae83ae46e07ffb417bdad62348aa8c7b11456f9f75ae0d936" protocol=ttrpc version=3
Mar 20 19:18:11.561486 systemd[1]: Started cri-containerd-3988a24d42c845fddef314b686458fe05b4b9001e11cc455dcc1a79ca2f1e0c1.scope - libcontainer container 3988a24d42c845fddef314b686458fe05b4b9001e11cc455dcc1a79ca2f1e0c1.
Mar 20 19:18:11.711925 containerd[1481]: time="2025-03-20T19:18:11.711797424Z" level=info msg="StartContainer for \"3988a24d42c845fddef314b686458fe05b4b9001e11cc455dcc1a79ca2f1e0c1\" returns successfully"
Mar 20 19:18:11.959653 kubelet[2812]: I0320 19:18:11.959585 2812 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 20 19:18:11.959653 kubelet[2812]: I0320 19:18:11.959657 2812 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 20 19:18:14.731543 containerd[1481]: time="2025-03-20T19:18:14.731439751Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" id:\"4d6f4b5cafd813c945c98900a9dde01d270cd15b4dc7c5ebff71fe519c851470\" pid:4803 exited_at:{seconds:1742498294 nanos:730085025}"
Mar 20 19:18:18.529134 update_engine[1469]: I20250320 19:18:18.528388 1469 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 20 19:18:18.529134 update_engine[1469]: I20250320 19:18:18.528928 1469 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 20 19:18:18.531018 update_engine[1469]: I20250320 19:18:18.530663 1469 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 20 19:18:18.536256 update_engine[1469]: E20250320 19:18:18.536049 1469 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 20 19:18:18.536256 update_engine[1469]: I20250320 19:18:18.536171 1469 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Mar 20 19:18:23.775834 containerd[1481]: time="2025-03-20T19:18:23.775682192Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" id:\"ab507176e36769adfc7427f8a6d84add616b7af13860e5161987ffad5343c9d0\" pid:4836 exited_at:{seconds:1742498303 nanos:775107848}"
Mar 20 19:18:23.805491 kubelet[2812]: I0320 19:18:23.803792 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-h7zvw" podStartSLOduration=44.651062621 podStartE2EDuration="54.803758086s" podCreationTimestamp="2025-03-20 19:17:29 +0000 UTC" firstStartedPulling="2025-03-20 19:18:01.35449003 +0000 UTC m=+53.749312232" lastFinishedPulling="2025-03-20 19:18:11.507185485 +0000 UTC m=+63.902007697" observedRunningTime="2025-03-20 19:18:12.14474071 +0000 UTC m=+64.539562962" watchObservedRunningTime="2025-03-20 19:18:23.803758086 +0000 UTC m=+76.198580288"
Mar 20 19:18:23.856732 containerd[1481]: time="2025-03-20T19:18:23.856431786Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" id:\"868ab2dac4fa10216f18c0b1ea210cef1b68846eef674be0eb0fdf5cc718183d\" pid:4863 exited_at:{seconds:1742498303 nanos:855847885}"
Mar 20 19:18:25.555282 containerd[1481]: time="2025-03-20T19:18:25.554195410Z" level=info msg="StopContainer for \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\" with timeout 300 (s)"
Mar 20 19:18:25.556567 containerd[1481]: time="2025-03-20T19:18:25.556532901Z" level=info msg="Stop container \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\" with signal terminated"
Mar 20 19:18:25.738922 containerd[1481]: time="2025-03-20T19:18:25.738880344Z" level=info msg="StopContainer for \"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" with timeout 30 (s)"
Mar 20 19:18:25.739557 containerd[1481]: time="2025-03-20T19:18:25.739418078Z" level=info msg="Stop container \"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" with signal terminated"
Mar 20 19:18:25.766791 systemd[1]: cri-containerd-8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c.scope: Deactivated successfully.
Mar 20 19:18:25.771399 containerd[1481]: time="2025-03-20T19:18:25.771093116Z" level=info msg="received exit event container_id:\"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" id:\"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" pid:4673 exit_status:2 exited_at:{seconds:1742498305 nanos:770857272}"
Mar 20 19:18:25.771399 containerd[1481]: time="2025-03-20T19:18:25.771272034Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" id:\"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" pid:4673 exit_status:2 exited_at:{seconds:1742498305 nanos:770857272}"
Mar 20 19:18:25.814130 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c-rootfs.mount: Deactivated successfully.
Mar 20 19:18:25.837627 containerd[1481]: time="2025-03-20T19:18:25.837587475Z" level=info msg="StopContainer for \"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" returns successfully"
Mar 20 19:18:25.839577 containerd[1481]: time="2025-03-20T19:18:25.838894892Z" level=info msg="StopPodSandbox for \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\""
Mar 20 19:18:25.839577 containerd[1481]: time="2025-03-20T19:18:25.838969473Z" level=info msg="Container to stop \"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 20 19:18:25.852787 systemd[1]: cri-containerd-afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00.scope: Deactivated successfully.
Mar 20 19:18:25.855064 containerd[1481]: time="2025-03-20T19:18:25.854980554Z" level=info msg="TaskExit event in podsandbox handler container_id:\"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" id:\"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" pid:4323 exit_status:137 exited_at:{seconds:1742498305 nanos:854504566}"
Mar 20 19:18:25.880799 containerd[1481]: time="2025-03-20T19:18:25.880484879Z" level=info msg="StopContainer for \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" with timeout 5 (s)"
Mar 20 19:18:25.880956 containerd[1481]: time="2025-03-20T19:18:25.880901736Z" level=info msg="Stop container \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" with signal terminated"
Mar 20 19:18:25.903962 containerd[1481]: time="2025-03-20T19:18:25.903518802Z" level=info msg="shim disconnected" id=afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00 namespace=k8s.io
Mar 20 19:18:25.903962 containerd[1481]: time="2025-03-20T19:18:25.903667773Z" level=warning msg="cleaning up after shim disconnected" id=afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00 namespace=k8s.io
Mar 20 19:18:25.903962 containerd[1481]: time="2025-03-20T19:18:25.903682911Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 20 19:18:25.903962 containerd[1481]: time="2025-03-20T19:18:25.903784543Z" level=error msg="Failed to handle event container_id:\"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" id:\"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" pid:4323 exit_status:137 exited_at:{seconds:1742498305 nanos:854504566} for afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" error="failed to handle container TaskExit event: failed to stop sandbox: ttrpc: closed"
Mar 20 19:18:25.903962 containerd[1481]: time="2025-03-20T19:18:25.903842392Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" id:\"3c3dcc6144ed48e15d640e8de03c3d34185d3885597c118671cae89619dcee04\" pid:4898 exited_at:{seconds:1742498305 nanos:874608240}"
Mar 20 19:18:25.903887 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00-rootfs.mount: Deactivated successfully.
Mar 20 19:18:25.924249 systemd[1]: cri-containerd-aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c.scope: Deactivated successfully.
Mar 20 19:18:25.925141 systemd[1]: cri-containerd-aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c.scope: Consumed 2.075s CPU time, 167M memory peak, 644K written to disk.
Mar 20 19:18:25.930761 containerd[1481]: time="2025-03-20T19:18:25.930711151Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" id:\"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" pid:3784 exited_at:{seconds:1742498305 nanos:929942440}"
Mar 20 19:18:25.931176 containerd[1481]: time="2025-03-20T19:18:25.931114662Z" level=info msg="received exit event container_id:\"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" id:\"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" pid:3784 exited_at:{seconds:1742498305 nanos:929942440}"
Mar 20 19:18:25.947104 containerd[1481]: time="2025-03-20T19:18:25.944230490Z" level=info msg="received exit event sandbox_id:\"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" exit_status:137 exited_at:{seconds:1742498305 nanos:854504566}"
Mar 20 19:18:25.950288 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00-shm.mount: Deactivated successfully.
Mar 20 19:18:25.979710 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c-rootfs.mount: Deactivated successfully.
Mar 20 19:18:25.996760 containerd[1481]: time="2025-03-20T19:18:25.996625555Z" level=info msg="StopContainer for \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" returns successfully"
Mar 20 19:18:25.997800 containerd[1481]: time="2025-03-20T19:18:25.997199669Z" level=info msg="StopPodSandbox for \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\""
Mar 20 19:18:25.997800 containerd[1481]: time="2025-03-20T19:18:25.997288026Z" level=info msg="Container to stop \"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 20 19:18:25.997800 containerd[1481]: time="2025-03-20T19:18:25.997306110Z" level=info msg="Container to stop \"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 20 19:18:25.997800 containerd[1481]: time="2025-03-20T19:18:25.997317791Z" level=info msg="Container to stop \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 20 19:18:26.008822 systemd[1]: cri-containerd-dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf.scope: Deactivated successfully.
Mar 20 19:18:26.011230 containerd[1481]: time="2025-03-20T19:18:26.011153945Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" id:\"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" pid:3336 exit_status:137 exited_at:{seconds:1742498306 nanos:10495522}"
Mar 20 19:18:26.050053 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf-rootfs.mount: Deactivated successfully.
Mar 20 19:18:26.053967 systemd-networkd[1393]: cali14f8a393e55: Link DOWN
Mar 20 19:18:26.053976 systemd-networkd[1393]: cali14f8a393e55: Lost carrier
Mar 20 19:18:26.055556 containerd[1481]: time="2025-03-20T19:18:26.055195452Z" level=info msg="shim disconnected" id=dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf namespace=k8s.io
Mar 20 19:18:26.055556 containerd[1481]: time="2025-03-20T19:18:26.055227131Z" level=warning msg="cleaning up after shim disconnected" id=dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf namespace=k8s.io
Mar 20 19:18:26.055556 containerd[1481]: time="2025-03-20T19:18:26.055236198Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 20 19:18:26.092793 containerd[1481]: time="2025-03-20T19:18:26.091928894Z" level=info msg="received exit event sandbox_id:\"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" exit_status:137 exited_at:{seconds:1742498306 nanos:10495522}"
Mar 20 19:18:26.093263 containerd[1481]: time="2025-03-20T19:18:26.092650106Z" level=info msg="TearDown network for sandbox \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" successfully"
Mar 20 19:18:26.093314 containerd[1481]: time="2025-03-20T19:18:26.093263293Z" level=info msg="StopPodSandbox for \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" returns successfully"
Mar 20 19:18:26.160776 kubelet[2812]: I0320 19:18:26.160720 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-lib-modules\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162761 kubelet[2812]: I0320 19:18:26.161829 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-node-certs\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162761 kubelet[2812]: I0320 19:18:26.161854 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-log-dir\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162761 kubelet[2812]: I0320 19:18:26.161874 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bfgp\" (UniqueName: \"kubernetes.io/projected/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-kube-api-access-7bfgp\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162761 kubelet[2812]: I0320 19:18:26.161895 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-tigera-ca-bundle\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162761 kubelet[2812]: I0320 19:18:26.161911 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-xtables-lock\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162761 kubelet[2812]: I0320 19:18:26.161932 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-bin-dir\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162976 kubelet[2812]: I0320 19:18:26.161948 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-policysync\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162976 kubelet[2812]: I0320 19:18:26.161965 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-flexvol-driver-host\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162976 kubelet[2812]: I0320 19:18:26.161981 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-net-dir\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162976 kubelet[2812]: I0320 19:18:26.161998 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-var-run-calico\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162976 kubelet[2812]: I0320 19:18:26.162019 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-var-lib-calico\") pod \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\" (UID: \"a9039d2e-8f0c-48c4-b46a-3a34401e53a3\") "
Mar 20 19:18:26.162976 kubelet[2812]: I0320 19:18:26.160861 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 19:18:26.163145 kubelet[2812]: I0320 19:18:26.162082 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 19:18:26.163658 kubelet[2812]: I0320 19:18:26.163212 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 19:18:26.163658 kubelet[2812]: I0320 19:18:26.163245 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 19:18:26.168819 kubelet[2812]: I0320 19:18:26.168552 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-policysync" (OuterVolumeSpecName: "policysync") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 19:18:26.168819 kubelet[2812]: I0320 19:18:26.168609 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 19:18:26.169598 kubelet[2812]: I0320 19:18:26.169005 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 19:18:26.169598 kubelet[2812]: I0320 19:18:26.169040 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 19:18:26.169598 kubelet[2812]: I0320 19:18:26.169070 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "xtables-lock".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 19:18:26.171325 kubelet[2812]: I0320 19:18:26.171302 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-node-certs" (OuterVolumeSpecName: "node-certs") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 19:18:26.171895 kubelet[2812]: I0320 19:18:26.171861 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-kube-api-access-7bfgp" (OuterVolumeSpecName: "kube-api-access-7bfgp") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "kube-api-access-7bfgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 19:18:26.179383 kubelet[2812]: I0320 19:18:26.179083 2812 topology_manager.go:215] "Topology Admit Handler" podUID="f368cdb3-2b4d-42c3-aa28-09f30e762fde" podNamespace="calico-system" podName="calico-node-79jlm" Mar 20 19:18:26.179383 kubelet[2812]: E0320 19:18:26.179148 2812 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="a9039d2e-8f0c-48c4-b46a-3a34401e53a3" containerName="calico-node" Mar 20 19:18:26.179383 kubelet[2812]: E0320 19:18:26.179161 2812 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="a9039d2e-8f0c-48c4-b46a-3a34401e53a3" containerName="flexvol-driver" Mar 20 19:18:26.179383 kubelet[2812]: E0320 19:18:26.179170 2812 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="a9039d2e-8f0c-48c4-b46a-3a34401e53a3" containerName="install-cni" Mar 20 19:18:26.179383 kubelet[2812]: I0320 19:18:26.179201 2812 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9039d2e-8f0c-48c4-b46a-3a34401e53a3" containerName="calico-node" Mar 20 19:18:26.180028 kubelet[2812]: I0320 
19:18:26.180011 2812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:18:26.189854 kubelet[2812]: I0320 19:18:26.186593 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "a9039d2e-8f0c-48c4-b46a-3a34401e53a3" (UID: "a9039d2e-8f0c-48c4-b46a-3a34401e53a3"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 19:18:26.198147 systemd[1]: Created slice kubepods-besteffort-podf368cdb3_2b4d_42c3_aa28_09f30e762fde.slice - libcontainer container kubepods-besteffort-podf368cdb3_2b4d_42c3_aa28_09f30e762fde.slice. Mar 20 19:18:26.209216 kubelet[2812]: I0320 19:18:26.209176 2812 scope.go:117] "RemoveContainer" containerID="aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c" Mar 20 19:18:26.218607 systemd[1]: Removed slice kubepods-besteffort-poda9039d2e_8f0c_48c4_b46a_3a34401e53a3.slice - libcontainer container kubepods-besteffort-poda9039d2e_8f0c_48c4_b46a_3a34401e53a3.slice. Mar 20 19:18:26.219069 systemd[1]: kubepods-besteffort-poda9039d2e_8f0c_48c4_b46a_3a34401e53a3.slice: Consumed 2.731s CPU time, 298.3M memory peak, 161M written to disk. Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.048 [INFO][4995] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.048 [INFO][4995] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" iface="eth0" netns="/var/run/netns/cni-b6d8d89f-fe32-9610-1ac7-1ed1894b7afe" Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.049 [INFO][4995] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" iface="eth0" netns="/var/run/netns/cni-b6d8d89f-fe32-9610-1ac7-1ed1894b7afe" Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.065 [INFO][4995] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" after=16.855815ms iface="eth0" netns="/var/run/netns/cni-b6d8d89f-fe32-9610-1ac7-1ed1894b7afe" Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.065 [INFO][4995] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.065 [INFO][4995] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.106 [INFO][5039] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.106 [INFO][5039] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.107 [INFO][5039] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.202 [INFO][5039] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.202 [INFO][5039] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.212 [INFO][5039] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 19:18:26.224978 containerd[1481]: 2025-03-20 19:18:26.217 [INFO][4995] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:18:26.226118 containerd[1481]: time="2025-03-20T19:18:26.225943310Z" level=info msg="TearDown network for sandbox \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" successfully" Mar 20 19:18:26.226118 containerd[1481]: time="2025-03-20T19:18:26.225974340Z" level=info msg="StopPodSandbox for \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" returns successfully" Mar 20 19:18:26.232648 containerd[1481]: time="2025-03-20T19:18:26.232613170Z" level=info msg="RemoveContainer for \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\"" Mar 20 19:18:26.249094 containerd[1481]: time="2025-03-20T19:18:26.249048831Z" level=info msg="RemoveContainer for \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" returns successfully" Mar 20 19:18:26.249324 kubelet[2812]: I0320 19:18:26.249299 2812 scope.go:117] "RemoveContainer" containerID="a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5" Mar 20 19:18:26.255747 containerd[1481]: time="2025-03-20T19:18:26.255383758Z" level=info msg="RemoveContainer for \"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\"" Mar 20 19:18:26.262443 kubelet[2812]: I0320 19:18:26.262276 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/f368cdb3-2b4d-42c3-aa28-09f30e762fde-var-run-calico\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262443 kubelet[2812]: I0320 19:18:26.262324 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/f368cdb3-2b4d-42c3-aa28-09f30e762fde-flexvol-driver-host\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " 
pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262443 kubelet[2812]: I0320 19:18:26.262379 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/f368cdb3-2b4d-42c3-aa28-09f30e762fde-policysync\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262443 kubelet[2812]: I0320 19:18:26.262404 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jbpr\" (UniqueName: \"kubernetes.io/projected/f368cdb3-2b4d-42c3-aa28-09f30e762fde-kube-api-access-5jbpr\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262443 kubelet[2812]: I0320 19:18:26.262426 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/f368cdb3-2b4d-42c3-aa28-09f30e762fde-cni-bin-dir\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262675 kubelet[2812]: I0320 19:18:26.262447 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f368cdb3-2b4d-42c3-aa28-09f30e762fde-tigera-ca-bundle\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262675 kubelet[2812]: I0320 19:18:26.262467 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/f368cdb3-2b4d-42c3-aa28-09f30e762fde-cni-net-dir\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262675 
kubelet[2812]: I0320 19:18:26.262489 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f368cdb3-2b4d-42c3-aa28-09f30e762fde-lib-modules\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262675 kubelet[2812]: I0320 19:18:26.262509 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f368cdb3-2b4d-42c3-aa28-09f30e762fde-var-lib-calico\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262675 kubelet[2812]: I0320 19:18:26.262531 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/f368cdb3-2b4d-42c3-aa28-09f30e762fde-cni-log-dir\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262815 kubelet[2812]: I0320 19:18:26.262551 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f368cdb3-2b4d-42c3-aa28-09f30e762fde-xtables-lock\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262815 kubelet[2812]: I0320 19:18:26.262572 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/f368cdb3-2b4d-42c3-aa28-09f30e762fde-node-certs\") pod \"calico-node-79jlm\" (UID: \"f368cdb3-2b4d-42c3-aa28-09f30e762fde\") " pod="calico-system/calico-node-79jlm" Mar 20 19:18:26.262815 kubelet[2812]: I0320 19:18:26.262595 2812 reconciler_common.go:289] "Volume detached for 
volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-var-lib-calico\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.262815 kubelet[2812]: I0320 19:18:26.262608 2812 reconciler_common.go:289] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-node-certs\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.262815 kubelet[2812]: I0320 19:18:26.262619 2812 reconciler_common.go:289] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-log-dir\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.262815 kubelet[2812]: I0320 19:18:26.262632 2812 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-7bfgp\" (UniqueName: \"kubernetes.io/projected/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-kube-api-access-7bfgp\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.262815 kubelet[2812]: I0320 19:18:26.262644 2812 reconciler_common.go:289] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-lib-modules\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.263010 kubelet[2812]: I0320 19:18:26.262656 2812 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-tigera-ca-bundle\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.263010 kubelet[2812]: I0320 19:18:26.262668 2812 reconciler_common.go:289] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-xtables-lock\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.263010 kubelet[2812]: I0320 
19:18:26.262680 2812 reconciler_common.go:289] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-policysync\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.263010 kubelet[2812]: I0320 19:18:26.262692 2812 reconciler_common.go:289] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-bin-dir\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.263010 kubelet[2812]: I0320 19:18:26.262702 2812 reconciler_common.go:289] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-flexvol-driver-host\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.263010 kubelet[2812]: I0320 19:18:26.262715 2812 reconciler_common.go:289] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-cni-net-dir\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.263010 kubelet[2812]: I0320 19:18:26.262726 2812 reconciler_common.go:289] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a9039d2e-8f0c-48c4-b46a-3a34401e53a3-var-run-calico\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.263755 containerd[1481]: time="2025-03-20T19:18:26.263712789Z" level=info msg="RemoveContainer for \"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\" returns successfully" Mar 20 19:18:26.263971 kubelet[2812]: I0320 19:18:26.263942 2812 scope.go:117] "RemoveContainer" containerID="22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1" Mar 20 19:18:26.266913 containerd[1481]: time="2025-03-20T19:18:26.266874746Z" level=info msg="RemoveContainer for \"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\"" Mar 20 
19:18:26.275267 containerd[1481]: time="2025-03-20T19:18:26.275149866Z" level=info msg="RemoveContainer for \"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\" returns successfully" Mar 20 19:18:26.276177 kubelet[2812]: I0320 19:18:26.275962 2812 scope.go:117] "RemoveContainer" containerID="aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c" Mar 20 19:18:26.276419 containerd[1481]: time="2025-03-20T19:18:26.276304174Z" level=error msg="ContainerStatus for \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\": not found" Mar 20 19:18:26.276729 kubelet[2812]: E0320 19:18:26.276629 2812 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\": not found" containerID="aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c" Mar 20 19:18:26.276976 kubelet[2812]: I0320 19:18:26.276667 2812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c"} err="failed to get container status \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\": rpc error: code = NotFound desc = an error occurred when try to find container \"aeb9de31d03b4c55a823c400192f0bcbf3dca150112da069de0312fabf27481c\": not found" Mar 20 19:18:26.276976 kubelet[2812]: I0320 19:18:26.276868 2812 scope.go:117] "RemoveContainer" containerID="a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5" Mar 20 19:18:26.277729 containerd[1481]: time="2025-03-20T19:18:26.277097702Z" level=error msg="ContainerStatus for \"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\" failed" error="rpc error: 
code = NotFound desc = an error occurred when try to find container \"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\": not found" Mar 20 19:18:26.277809 kubelet[2812]: E0320 19:18:26.277514 2812 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\": not found" containerID="a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5" Mar 20 19:18:26.277809 kubelet[2812]: I0320 19:18:26.277537 2812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5"} err="failed to get container status \"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\": rpc error: code = NotFound desc = an error occurred when try to find container \"a4993ade774bca1b9f2834904336bfd6afbe171d784be71a8329f71491d652b5\": not found" Mar 20 19:18:26.277809 kubelet[2812]: I0320 19:18:26.277587 2812 scope.go:117] "RemoveContainer" containerID="22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1" Mar 20 19:18:26.277945 containerd[1481]: time="2025-03-20T19:18:26.277911268Z" level=error msg="ContainerStatus for \"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\": not found" Mar 20 19:18:26.278266 kubelet[2812]: E0320 19:18:26.278064 2812 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\": not found" containerID="22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1" Mar 20 19:18:26.278266 kubelet[2812]: I0320 19:18:26.278091 
2812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1"} err="failed to get container status \"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\": rpc error: code = NotFound desc = an error occurred when try to find container \"22bcafdaa14d90eed02d9633ae8ba97c4790bc111abb6449395e3319f7739ca1\": not found" Mar 20 19:18:26.363541 kubelet[2812]: I0320 19:18:26.363393 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357e32cb-acd2-40bd-a1ff-f4c6a737b6d0-tigera-ca-bundle\") pod \"357e32cb-acd2-40bd-a1ff-f4c6a737b6d0\" (UID: \"357e32cb-acd2-40bd-a1ff-f4c6a737b6d0\") " Mar 20 19:18:26.363541 kubelet[2812]: I0320 19:18:26.363446 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhj8v\" (UniqueName: \"kubernetes.io/projected/357e32cb-acd2-40bd-a1ff-f4c6a737b6d0-kube-api-access-mhj8v\") pod \"357e32cb-acd2-40bd-a1ff-f4c6a737b6d0\" (UID: \"357e32cb-acd2-40bd-a1ff-f4c6a737b6d0\") " Mar 20 19:18:26.371382 kubelet[2812]: I0320 19:18:26.370297 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/357e32cb-acd2-40bd-a1ff-f4c6a737b6d0-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "357e32cb-acd2-40bd-a1ff-f4c6a737b6d0" (UID: "357e32cb-acd2-40bd-a1ff-f4c6a737b6d0"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 19:18:26.374827 kubelet[2812]: I0320 19:18:26.374791 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/357e32cb-acd2-40bd-a1ff-f4c6a737b6d0-kube-api-access-mhj8v" (OuterVolumeSpecName: "kube-api-access-mhj8v") pod "357e32cb-acd2-40bd-a1ff-f4c6a737b6d0" (UID: "357e32cb-acd2-40bd-a1ff-f4c6a737b6d0"). 
InnerVolumeSpecName "kube-api-access-mhj8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 19:18:26.464626 kubelet[2812]: I0320 19:18:26.464560 2812 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357e32cb-acd2-40bd-a1ff-f4c6a737b6d0-tigera-ca-bundle\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.464626 kubelet[2812]: I0320 19:18:26.464622 2812 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-mhj8v\" (UniqueName: \"kubernetes.io/projected/357e32cb-acd2-40bd-a1ff-f4c6a737b6d0-kube-api-access-mhj8v\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\"" Mar 20 19:18:26.509946 containerd[1481]: time="2025-03-20T19:18:26.509329540Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-79jlm,Uid:f368cdb3-2b4d-42c3-aa28-09f30e762fde,Namespace:calico-system,Attempt:0,}" Mar 20 19:18:26.533402 containerd[1481]: time="2025-03-20T19:18:26.533176439Z" level=info msg="connecting to shim 0c92ffd611511985573a49927dff246f420d73687dab1b343db9be60bc985564" address="unix:///run/containerd/s/0a38a8763d2acf7848dca5276b737c64f09a63d32a2127fa1cb3fa640a07de0a" namespace=k8s.io protocol=ttrpc version=3 Mar 20 19:18:26.559522 systemd[1]: Started cri-containerd-0c92ffd611511985573a49927dff246f420d73687dab1b343db9be60bc985564.scope - libcontainer container 0c92ffd611511985573a49927dff246f420d73687dab1b343db9be60bc985564. Mar 20 19:18:26.571230 systemd[1]: cri-containerd-bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610.scope: Deactivated successfully. 
Mar 20 19:18:26.575017 containerd[1481]: time="2025-03-20T19:18:26.574457225Z" level=info msg="received exit event container_id:\"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\" id:\"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\" pid:3381 exit_status:1 exited_at:{seconds:1742498306 nanos:573807138}" Mar 20 19:18:26.575017 containerd[1481]: time="2025-03-20T19:18:26.574523020Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\" id:\"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\" pid:3381 exit_status:1 exited_at:{seconds:1742498306 nanos:573807138}" Mar 20 19:18:26.640508 containerd[1481]: time="2025-03-20T19:18:26.640453901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-79jlm,Uid:f368cdb3-2b4d-42c3-aa28-09f30e762fde,Namespace:calico-system,Attempt:0,} returns sandbox id \"0c92ffd611511985573a49927dff246f420d73687dab1b343db9be60bc985564\"" Mar 20 19:18:26.643398 containerd[1481]: time="2025-03-20T19:18:26.643340950Z" level=info msg="StopContainer for \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\" returns successfully" Mar 20 19:18:26.644685 containerd[1481]: time="2025-03-20T19:18:26.644659890Z" level=info msg="StopPodSandbox for \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\"" Mar 20 19:18:26.644763 containerd[1481]: time="2025-03-20T19:18:26.644727437Z" level=info msg="Container to stop \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 20 19:18:26.645419 containerd[1481]: time="2025-03-20T19:18:26.645277405Z" level=info msg="CreateContainer within sandbox \"0c92ffd611511985573a49927dff246f420d73687dab1b343db9be60bc985564\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 20 19:18:26.658285 systemd[1]: 
cri-containerd-9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7.scope: Deactivated successfully.
Mar 20 19:18:26.664183 containerd[1481]: time="2025-03-20T19:18:26.663580691Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\" id:\"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\" pid:3284 exit_status:137 exited_at:{seconds:1742498306 nanos:661719890}"
Mar 20 19:18:26.666895 containerd[1481]: time="2025-03-20T19:18:26.666858078Z" level=info msg="Container e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:18:26.683040 containerd[1481]: time="2025-03-20T19:18:26.682921937Z" level=info msg="CreateContainer within sandbox \"0c92ffd611511985573a49927dff246f420d73687dab1b343db9be60bc985564\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5\""
Mar 20 19:18:26.684372 containerd[1481]: time="2025-03-20T19:18:26.684178189Z" level=info msg="StartContainer for \"e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5\""
Mar 20 19:18:26.686503 containerd[1481]: time="2025-03-20T19:18:26.686469564Z" level=info msg="connecting to shim e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5" address="unix:///run/containerd/s/0a38a8763d2acf7848dca5276b737c64f09a63d32a2127fa1cb3fa640a07de0a" protocol=ttrpc version=3
Mar 20 19:18:26.711775 containerd[1481]: time="2025-03-20T19:18:26.711732256Z" level=info msg="shim disconnected" id=9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7 namespace=k8s.io
Mar 20 19:18:26.712358 containerd[1481]: time="2025-03-20T19:18:26.712247879Z" level=warning msg="cleaning up after shim disconnected" id=9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7 namespace=k8s.io
Mar 20 19:18:26.712358 containerd[1481]: time="2025-03-20T19:18:26.712264971Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 20 19:18:26.743675 systemd[1]: Started cri-containerd-e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5.scope - libcontainer container e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5.
Mar 20 19:18:26.755856 containerd[1481]: time="2025-03-20T19:18:26.755815461Z" level=info msg="received exit event sandbox_id:\"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\" exit_status:137 exited_at:{seconds:1742498306 nanos:661719890}"
Mar 20 19:18:26.759009 containerd[1481]: time="2025-03-20T19:18:26.758660530Z" level=info msg="TearDown network for sandbox \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\" successfully"
Mar 20 19:18:26.759009 containerd[1481]: time="2025-03-20T19:18:26.758686459Z" level=info msg="StopPodSandbox for \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\" returns successfully"
Mar 20 19:18:26.831735 systemd[1]: var-lib-kubelet-pods-357e32cb\x2dacd2\x2d40bd\x2da1ff\x2df4c6a737b6d0-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully.
Mar 20 19:18:26.832581 systemd[1]: run-netns-cni\x2db6d8d89f\x2dfe32\x2d9610\x2d1ac7\x2d1ed1894b7afe.mount: Deactivated successfully.
Mar 20 19:18:26.832995 systemd[1]: var-lib-kubelet-pods-a9039d2e\x2d8f0c\x2d48c4\x2db46a\x2d3a34401e53a3-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully.
Mar 20 19:18:26.833871 systemd[1]: var-lib-kubelet-pods-357e32cb\x2dacd2\x2d40bd\x2da1ff\x2df4c6a737b6d0-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dmhj8v.mount: Deactivated successfully.
Mar 20 19:18:26.835461 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610-rootfs.mount: Deactivated successfully.
Mar 20 19:18:26.835539 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf-shm.mount: Deactivated successfully.
Mar 20 19:18:26.835612 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7-rootfs.mount: Deactivated successfully.
Mar 20 19:18:26.835697 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7-shm.mount: Deactivated successfully.
Mar 20 19:18:26.835769 systemd[1]: var-lib-kubelet-pods-a9039d2e\x2d8f0c\x2d48c4\x2db46a\x2d3a34401e53a3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d7bfgp.mount: Deactivated successfully.
Mar 20 19:18:26.835855 systemd[1]: var-lib-kubelet-pods-a9039d2e\x2d8f0c\x2d48c4\x2db46a\x2d3a34401e53a3-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully.
Mar 20 19:18:26.867060 kubelet[2812]: I0320 19:18:26.867027 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4e2c434f-3f0f-44eb-9b02-1e9062595f29-typha-certs\") pod \"4e2c434f-3f0f-44eb-9b02-1e9062595f29\" (UID: \"4e2c434f-3f0f-44eb-9b02-1e9062595f29\") "
Mar 20 19:18:26.867184 kubelet[2812]: I0320 19:18:26.867071 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spllw\" (UniqueName: \"kubernetes.io/projected/4e2c434f-3f0f-44eb-9b02-1e9062595f29-kube-api-access-spllw\") pod \"4e2c434f-3f0f-44eb-9b02-1e9062595f29\" (UID: \"4e2c434f-3f0f-44eb-9b02-1e9062595f29\") "
Mar 20 19:18:26.867184 kubelet[2812]: I0320 19:18:26.867102 2812 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e2c434f-3f0f-44eb-9b02-1e9062595f29-tigera-ca-bundle\") pod \"4e2c434f-3f0f-44eb-9b02-1e9062595f29\" (UID: \"4e2c434f-3f0f-44eb-9b02-1e9062595f29\") "
Mar 20 19:18:26.880512 kubelet[2812]: I0320 19:18:26.874981 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e2c434f-3f0f-44eb-9b02-1e9062595f29-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "4e2c434f-3f0f-44eb-9b02-1e9062595f29" (UID: "4e2c434f-3f0f-44eb-9b02-1e9062595f29"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 19:18:26.878214 systemd[1]: var-lib-kubelet-pods-4e2c434f\x2d3f0f\x2d44eb\x2d9b02\x2d1e9062595f29-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully.
Mar 20 19:18:26.887384 kubelet[2812]: I0320 19:18:26.886342 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2c434f-3f0f-44eb-9b02-1e9062595f29-kube-api-access-spllw" (OuterVolumeSpecName: "kube-api-access-spllw") pod "4e2c434f-3f0f-44eb-9b02-1e9062595f29" (UID: "4e2c434f-3f0f-44eb-9b02-1e9062595f29"). InnerVolumeSpecName "kube-api-access-spllw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 19:18:26.886845 systemd[1]: var-lib-kubelet-pods-4e2c434f\x2d3f0f\x2d44eb\x2d9b02\x2d1e9062595f29-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully.
Mar 20 19:18:26.886958 systemd[1]: var-lib-kubelet-pods-4e2c434f\x2d3f0f\x2d44eb\x2d9b02\x2d1e9062595f29-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dspllw.mount: Deactivated successfully.
Mar 20 19:18:26.890500 kubelet[2812]: I0320 19:18:26.890020 2812 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e2c434f-3f0f-44eb-9b02-1e9062595f29-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "4e2c434f-3f0f-44eb-9b02-1e9062595f29" (UID: "4e2c434f-3f0f-44eb-9b02-1e9062595f29"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 19:18:26.915267 containerd[1481]: time="2025-03-20T19:18:26.915048766Z" level=info msg="StartContainer for \"e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5\" returns successfully"
Mar 20 19:18:26.931839 systemd[1]: cri-containerd-e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5.scope: Deactivated successfully.
Mar 20 19:18:26.932302 systemd[1]: cri-containerd-e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5.scope: Consumed 40ms CPU time, 7.9M memory peak, 6.3M written to disk.
Mar 20 19:18:26.935078 containerd[1481]: time="2025-03-20T19:18:26.935039729Z" level=info msg="received exit event container_id:\"e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5\" id:\"e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5\" pid:5166 exited_at:{seconds:1742498306 nanos:934705698}"
Mar 20 19:18:26.935597 containerd[1481]: time="2025-03-20T19:18:26.935412802Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5\" id:\"e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5\" pid:5166 exited_at:{seconds:1742498306 nanos:934705698}"
Mar 20 19:18:26.964188 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e7a377a645708c60c76fa7cd058264b968ea111b7ab93a74a2d5f2373e9647e5-rootfs.mount: Deactivated successfully.
Mar 20 19:18:26.968622 kubelet[2812]: I0320 19:18:26.968552 2812 reconciler_common.go:289] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4e2c434f-3f0f-44eb-9b02-1e9062595f29-typha-certs\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\""
Mar 20 19:18:26.968622 kubelet[2812]: I0320 19:18:26.968584 2812 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-spllw\" (UniqueName: \"kubernetes.io/projected/4e2c434f-3f0f-44eb-9b02-1e9062595f29-kube-api-access-spllw\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\""
Mar 20 19:18:26.968622 kubelet[2812]: I0320 19:18:26.968597 2812 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e2c434f-3f0f-44eb-9b02-1e9062595f29-tigera-ca-bundle\") on node \"ci-9999-0-1-1-f6fba67404.novalocal\" DevicePath \"\""
Mar 20 19:18:27.015623 containerd[1481]: time="2025-03-20T19:18:27.015509289Z" level=info msg="TaskExit event in podsandbox handler container_id:\"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" id:\"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" pid:4323 exit_status:137 exited_at:{seconds:1742498305 nanos:854504566}"
Mar 20 19:18:27.224565 containerd[1481]: time="2025-03-20T19:18:27.222448739Z" level=info msg="CreateContainer within sandbox \"0c92ffd611511985573a49927dff246f420d73687dab1b343db9be60bc985564\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 20 19:18:27.230703 kubelet[2812]: I0320 19:18:27.230664 2812 scope.go:117] "RemoveContainer" containerID="bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610"
Mar 20 19:18:27.238384 containerd[1481]: time="2025-03-20T19:18:27.236903658Z" level=info msg="RemoveContainer for \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\""
Mar 20 19:18:27.248968 containerd[1481]: time="2025-03-20T19:18:27.248922480Z" level=info msg="Container 08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:18:27.250949 containerd[1481]: time="2025-03-20T19:18:27.249126906Z" level=info msg="RemoveContainer for \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\" returns successfully"
Mar 20 19:18:27.251854 kubelet[2812]: I0320 19:18:27.251824 2812 scope.go:117] "RemoveContainer" containerID="bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610"
Mar 20 19:18:27.252335 containerd[1481]: time="2025-03-20T19:18:27.252300508Z" level=error msg="ContainerStatus for \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\": not found"
Mar 20 19:18:27.252614 kubelet[2812]: E0320 19:18:27.252585 2812 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\": not found" containerID="bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610"
Mar 20 19:18:27.252822 kubelet[2812]: I0320 19:18:27.252800 2812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610"} err="failed to get container status \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\": rpc error: code = NotFound desc = an error occurred when try to find container \"bc688b699dbf262fbb1c2ce80118b971446896f8551df7e5df22c3de5f9de610\": not found"
Mar 20 19:18:27.258574 systemd[1]: Removed slice kubepods-besteffort-pod4e2c434f_3f0f_44eb_9b02_1e9062595f29.slice - libcontainer container kubepods-besteffort-pod4e2c434f_3f0f_44eb_9b02_1e9062595f29.slice.
Mar 20 19:18:27.263250 systemd[1]: Removed slice kubepods-besteffort-pod357e32cb_acd2_40bd_a1ff_f4c6a737b6d0.slice - libcontainer container kubepods-besteffort-pod357e32cb_acd2_40bd_a1ff_f4c6a737b6d0.slice.
Mar 20 19:18:27.265894 containerd[1481]: time="2025-03-20T19:18:27.265708672Z" level=info msg="CreateContainer within sandbox \"0c92ffd611511985573a49927dff246f420d73687dab1b343db9be60bc985564\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4\""
Mar 20 19:18:27.267640 containerd[1481]: time="2025-03-20T19:18:27.266763073Z" level=info msg="StartContainer for \"08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4\""
Mar 20 19:18:27.269274 containerd[1481]: time="2025-03-20T19:18:27.269250028Z" level=info msg="connecting to shim 08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4" address="unix:///run/containerd/s/0a38a8763d2acf7848dca5276b737c64f09a63d32a2127fa1cb3fa640a07de0a" protocol=ttrpc version=3
Mar 20 19:18:27.302842 systemd[1]: Started cri-containerd-08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4.scope - libcontainer container 08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4.
Mar 20 19:18:27.390686 containerd[1481]: time="2025-03-20T19:18:27.390477784Z" level=info msg="StartContainer for \"08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4\" returns successfully"
Mar 20 19:18:27.411478 kubelet[2812]: I0320 19:18:27.410305 2812 topology_manager.go:215] "Topology Admit Handler" podUID="bff07c95-f4c3-4f6d-ad0d-d4243ef08cca" podNamespace="calico-system" podName="calico-typha-5457f9ccb7-t452h"
Mar 20 19:18:27.411478 kubelet[2812]: E0320 19:18:27.410389 2812 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="4e2c434f-3f0f-44eb-9b02-1e9062595f29" containerName="calico-typha"
Mar 20 19:18:27.411478 kubelet[2812]: E0320 19:18:27.410402 2812 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="357e32cb-acd2-40bd-a1ff-f4c6a737b6d0" containerName="calico-kube-controllers"
Mar 20 19:18:27.411478 kubelet[2812]: I0320 19:18:27.410430 2812 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e2c434f-3f0f-44eb-9b02-1e9062595f29" containerName="calico-typha"
Mar 20 19:18:27.411478 kubelet[2812]: I0320 19:18:27.410438 2812 memory_manager.go:354] "RemoveStaleState removing state" podUID="357e32cb-acd2-40bd-a1ff-f4c6a737b6d0" containerName="calico-kube-controllers"
Mar 20 19:18:27.416149 kubelet[2812]: W0320 19:18:27.416108 2812 reflector.go:547] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-9999-0-1-1-f6fba67404.novalocal" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-9999-0-1-1-f6fba67404.novalocal' and this object
Mar 20 19:18:27.416272 kubelet[2812]: E0320 19:18:27.416171 2812 reflector.go:150] object-"calico-system"/"typha-certs": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-9999-0-1-1-f6fba67404.novalocal" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-9999-0-1-1-f6fba67404.novalocal' and this object
Mar 20 19:18:27.423450 systemd[1]: Created slice kubepods-besteffort-podbff07c95_f4c3_4f6d_ad0d_d4243ef08cca.slice - libcontainer container kubepods-besteffort-podbff07c95_f4c3_4f6d_ad0d_d4243ef08cca.slice.
Mar 20 19:18:27.472477 kubelet[2812]: I0320 19:18:27.472439 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bff07c95-f4c3-4f6d-ad0d-d4243ef08cca-tigera-ca-bundle\") pod \"calico-typha-5457f9ccb7-t452h\" (UID: \"bff07c95-f4c3-4f6d-ad0d-d4243ef08cca\") " pod="calico-system/calico-typha-5457f9ccb7-t452h"
Mar 20 19:18:27.472477 kubelet[2812]: I0320 19:18:27.472489 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/bff07c95-f4c3-4f6d-ad0d-d4243ef08cca-typha-certs\") pod \"calico-typha-5457f9ccb7-t452h\" (UID: \"bff07c95-f4c3-4f6d-ad0d-d4243ef08cca\") " pod="calico-system/calico-typha-5457f9ccb7-t452h"
Mar 20 19:18:27.472654 kubelet[2812]: I0320 19:18:27.472515 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5dwq\" (UniqueName: \"kubernetes.io/projected/bff07c95-f4c3-4f6d-ad0d-d4243ef08cca-kube-api-access-z5dwq\") pod \"calico-typha-5457f9ccb7-t452h\" (UID: \"bff07c95-f4c3-4f6d-ad0d-d4243ef08cca\") " pod="calico-system/calico-typha-5457f9ccb7-t452h"
Mar 20 19:18:27.715495 kubelet[2812]: I0320 19:18:27.715414 2812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="357e32cb-acd2-40bd-a1ff-f4c6a737b6d0" path="/var/lib/kubelet/pods/357e32cb-acd2-40bd-a1ff-f4c6a737b6d0/volumes"
Mar 20 19:18:27.716015 kubelet[2812]: I0320 19:18:27.715992 2812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2c434f-3f0f-44eb-9b02-1e9062595f29" path="/var/lib/kubelet/pods/4e2c434f-3f0f-44eb-9b02-1e9062595f29/volumes"
Mar 20 19:18:27.716554 kubelet[2812]: I0320 19:18:27.716529 2812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9039d2e-8f0c-48c4-b46a-3a34401e53a3" path="/var/lib/kubelet/pods/a9039d2e-8f0c-48c4-b46a-3a34401e53a3/volumes"
Mar 20 19:18:28.522458 update_engine[1469]: I20250320 19:18:28.522395 1469 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 20 19:18:28.522811 update_engine[1469]: I20250320 19:18:28.522614 1469 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 20 19:18:28.522877 update_engine[1469]: I20250320 19:18:28.522850 1469 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 20 19:18:28.528241 update_engine[1469]: E20250320 19:18:28.528184 1469 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 20 19:18:28.528383 update_engine[1469]: I20250320 19:18:28.528270 1469 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Mar 20 19:18:28.612140 systemd[1]: cri-containerd-08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4.scope: Deactivated successfully.
Mar 20 19:18:28.612510 systemd[1]: cri-containerd-08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4.scope: Consumed 767ms CPU time, 49.6M memory peak, 34.1M read from disk.
Mar 20 19:18:28.613242 containerd[1481]: time="2025-03-20T19:18:28.613050549Z" level=info msg="TaskExit event in podsandbox handler container_id:\"08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4\" id:\"08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4\" pid:5221 exited_at:{seconds:1742498308 nanos:612726497}"
Mar 20 19:18:28.613242 containerd[1481]: time="2025-03-20T19:18:28.613155057Z" level=info msg="received exit event container_id:\"08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4\" id:\"08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4\" pid:5221 exited_at:{seconds:1742498308 nanos:612726497}"
Mar 20 19:18:28.629933 containerd[1481]: time="2025-03-20T19:18:28.629893457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5457f9ccb7-t452h,Uid:bff07c95-f4c3-4f6d-ad0d-d4243ef08cca,Namespace:calico-system,Attempt:0,}"
Mar 20 19:18:28.644717 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-08bab2a71217201a0628fd63d5ef99288184c4c6840b7c3e9392fafca8af4fb4-rootfs.mount: Deactivated successfully.
Mar 20 19:18:28.679057 containerd[1481]: time="2025-03-20T19:18:28.679012738Z" level=info msg="connecting to shim 485debcc9db8c7fc47b5851cfac50a2211880d6eabbb65f6f8700dce1bab9b26" address="unix:///run/containerd/s/d092ab6b6bbd7ad11e76c59389886f28f3e096e563754881e04f662b1e97c72f" namespace=k8s.io protocol=ttrpc version=3
Mar 20 19:18:28.710510 systemd[1]: Started cri-containerd-485debcc9db8c7fc47b5851cfac50a2211880d6eabbb65f6f8700dce1bab9b26.scope - libcontainer container 485debcc9db8c7fc47b5851cfac50a2211880d6eabbb65f6f8700dce1bab9b26.
Mar 20 19:18:28.770306 containerd[1481]: time="2025-03-20T19:18:28.770265830Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5457f9ccb7-t452h,Uid:bff07c95-f4c3-4f6d-ad0d-d4243ef08cca,Namespace:calico-system,Attempt:0,} returns sandbox id \"485debcc9db8c7fc47b5851cfac50a2211880d6eabbb65f6f8700dce1bab9b26\""
Mar 20 19:18:28.781948 containerd[1481]: time="2025-03-20T19:18:28.781833044Z" level=info msg="CreateContainer within sandbox \"485debcc9db8c7fc47b5851cfac50a2211880d6eabbb65f6f8700dce1bab9b26\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 20 19:18:28.795423 containerd[1481]: time="2025-03-20T19:18:28.795380830Z" level=info msg="Container 4d2eaac0aaccbab6dd4170864d9c1103c8ba1e25bf517a6479b4f24bba1392d2: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:18:28.815754 containerd[1481]: time="2025-03-20T19:18:28.814589836Z" level=info msg="CreateContainer within sandbox \"485debcc9db8c7fc47b5851cfac50a2211880d6eabbb65f6f8700dce1bab9b26\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"4d2eaac0aaccbab6dd4170864d9c1103c8ba1e25bf517a6479b4f24bba1392d2\""
Mar 20 19:18:28.816492 containerd[1481]: time="2025-03-20T19:18:28.816232718Z" level=info msg="StartContainer for \"4d2eaac0aaccbab6dd4170864d9c1103c8ba1e25bf517a6479b4f24bba1392d2\""
Mar 20 19:18:28.822495 containerd[1481]: time="2025-03-20T19:18:28.822021691Z" level=info msg="connecting to shim 4d2eaac0aaccbab6dd4170864d9c1103c8ba1e25bf517a6479b4f24bba1392d2" address="unix:///run/containerd/s/d092ab6b6bbd7ad11e76c59389886f28f3e096e563754881e04f662b1e97c72f" protocol=ttrpc version=3
Mar 20 19:18:28.858193 systemd[1]: Started cri-containerd-4d2eaac0aaccbab6dd4170864d9c1103c8ba1e25bf517a6479b4f24bba1392d2.scope - libcontainer container 4d2eaac0aaccbab6dd4170864d9c1103c8ba1e25bf517a6479b4f24bba1392d2.
Mar 20 19:18:28.930567 containerd[1481]: time="2025-03-20T19:18:28.930527474Z" level=info msg="StartContainer for \"4d2eaac0aaccbab6dd4170864d9c1103c8ba1e25bf517a6479b4f24bba1392d2\" returns successfully"
Mar 20 19:18:29.299896 containerd[1481]: time="2025-03-20T19:18:29.299813017Z" level=info msg="CreateContainer within sandbox \"0c92ffd611511985573a49927dff246f420d73687dab1b343db9be60bc985564\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 20 19:18:29.319079 containerd[1481]: time="2025-03-20T19:18:29.317926444Z" level=info msg="Container 5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c: CDI devices from CRI Config.CDIDevices: []"
Mar 20 19:18:29.341913 containerd[1481]: time="2025-03-20T19:18:29.341637915Z" level=info msg="CreateContainer within sandbox \"0c92ffd611511985573a49927dff246f420d73687dab1b343db9be60bc985564\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c\""
Mar 20 19:18:29.344384 kubelet[2812]: I0320 19:18:29.342750 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5457f9ccb7-t452h" podStartSLOduration=4.342734396 podStartE2EDuration="4.342734396s" podCreationTimestamp="2025-03-20 19:18:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 19:18:29.340911933 +0000 UTC m=+81.735734145" watchObservedRunningTime="2025-03-20 19:18:29.342734396 +0000 UTC m=+81.737556598"
Mar 20 19:18:29.345019 containerd[1481]: time="2025-03-20T19:18:29.344984386Z" level=info msg="StartContainer for \"5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c\""
Mar 20 19:18:29.348771 containerd[1481]: time="2025-03-20T19:18:29.348729471Z" level=info msg="connecting to shim 5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c" address="unix:///run/containerd/s/0a38a8763d2acf7848dca5276b737c64f09a63d32a2127fa1cb3fa640a07de0a" protocol=ttrpc version=3
Mar 20 19:18:29.382415 systemd[1]: Started cri-containerd-5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c.scope - libcontainer container 5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c.
Mar 20 19:18:29.455157 containerd[1481]: time="2025-03-20T19:18:29.455117347Z" level=info msg="StartContainer for \"5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c\" returns successfully"
Mar 20 19:18:29.580027 kubelet[2812]: I0320 19:18:29.579909 2812 topology_manager.go:215] "Topology Admit Handler" podUID="3396610e-b805-4c60-9fbd-43cfd2c867c7" podNamespace="calico-system" podName="calico-kube-controllers-f9fdd584f-m7mq7"
Mar 20 19:18:29.593950 systemd[1]: Created slice kubepods-besteffort-pod3396610e_b805_4c60_9fbd_43cfd2c867c7.slice - libcontainer container kubepods-besteffort-pod3396610e_b805_4c60_9fbd_43cfd2c867c7.slice.
Mar 20 19:18:29.686425 kubelet[2812]: I0320 19:18:29.686333 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qmft\" (UniqueName: \"kubernetes.io/projected/3396610e-b805-4c60-9fbd-43cfd2c867c7-kube-api-access-8qmft\") pod \"calico-kube-controllers-f9fdd584f-m7mq7\" (UID: \"3396610e-b805-4c60-9fbd-43cfd2c867c7\") " pod="calico-system/calico-kube-controllers-f9fdd584f-m7mq7"
Mar 20 19:18:29.686425 kubelet[2812]: I0320 19:18:29.686400 2812 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3396610e-b805-4c60-9fbd-43cfd2c867c7-tigera-ca-bundle\") pod \"calico-kube-controllers-f9fdd584f-m7mq7\" (UID: \"3396610e-b805-4c60-9fbd-43cfd2c867c7\") " pod="calico-system/calico-kube-controllers-f9fdd584f-m7mq7"
Mar 20 19:18:29.898368 containerd[1481]: time="2025-03-20T19:18:29.898313123Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-f9fdd584f-m7mq7,Uid:3396610e-b805-4c60-9fbd-43cfd2c867c7,Namespace:calico-system,Attempt:0,}"
Mar 20 19:18:30.059466 systemd-networkd[1393]: cali6fdffd26a7f: Link UP
Mar 20 19:18:30.059674 systemd-networkd[1393]: cali6fdffd26a7f: Gained carrier
Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:29.951 [INFO][5391] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0 calico-kube-controllers-f9fdd584f- calico-system 3396610e-b805-4c60-9fbd-43cfd2c867c7 1070 0 2025-03-20 19:18:27 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:f9fdd584f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-9999-0-1-1-f6fba67404.novalocal calico-kube-controllers-f9fdd584f-m7mq7 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali6fdffd26a7f [] []}} ContainerID="ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" Namespace="calico-system" Pod="calico-kube-controllers-f9fdd584f-m7mq7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-"
Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:29.951 [INFO][5391] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" Namespace="calico-system" Pod="calico-kube-controllers-f9fdd584f-m7mq7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0"
Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:29.996 [INFO][5403] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" HandleID="k8s-pod-network.ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0"
Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.008 [INFO][5403] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" HandleID="k8s-pod-network.ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004d2960), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-1-1-f6fba67404.novalocal", "pod":"calico-kube-controllers-f9fdd584f-m7mq7", "timestamp":"2025-03-20 19:18:29.99606213 +0000 UTC"}, Hostname:"ci-9999-0-1-1-f6fba67404.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.008 [INFO][5403] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.008 [INFO][5403] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.008 [INFO][5403] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-1-1-f6fba67404.novalocal' Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.010 [INFO][5403] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.017 [INFO][5403] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.028 [INFO][5403] ipam/ipam.go 489: Trying affinity for 192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.031 [INFO][5403] ipam/ipam.go 155: Attempting to load block cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.035 [INFO][5403] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.96.128/26 host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.035 [INFO][5403] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.96.128/26 handle="k8s-pod-network.ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.037 [INFO][5403] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.043 [INFO][5403] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.96.128/26 handle="k8s-pod-network.ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:30.082297 
containerd[1481]: 2025-03-20 19:18:30.051 [INFO][5403] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.96.135/26] block=192.168.96.128/26 handle="k8s-pod-network.ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.051 [INFO][5403] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.96.135/26] handle="k8s-pod-network.ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" host="ci-9999-0-1-1-f6fba67404.novalocal" Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.051 [INFO][5403] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 19:18:30.082297 containerd[1481]: 2025-03-20 19:18:30.051 [INFO][5403] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.96.135/26] IPv6=[] ContainerID="ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" HandleID="k8s-pod-network.ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0" Mar 20 19:18:30.084008 containerd[1481]: 2025-03-20 19:18:30.053 [INFO][5391] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" Namespace="calico-system" Pod="calico-kube-controllers-f9fdd584f-m7mq7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0", GenerateName:"calico-kube-controllers-f9fdd584f-", Namespace:"calico-system", SelfLink:"", UID:"3396610e-b805-4c60-9fbd-43cfd2c867c7", ResourceVersion:"1070", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 18, 27, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"f9fdd584f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"", Pod:"calico-kube-controllers-f9fdd584f-m7mq7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6fdffd26a7f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 19:18:30.084008 containerd[1481]: 2025-03-20 19:18:30.054 [INFO][5391] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.96.135/32] ContainerID="ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" Namespace="calico-system" Pod="calico-kube-controllers-f9fdd584f-m7mq7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0" Mar 20 19:18:30.084008 containerd[1481]: 2025-03-20 19:18:30.054 [INFO][5391] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6fdffd26a7f ContainerID="ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" Namespace="calico-system" Pod="calico-kube-controllers-f9fdd584f-m7mq7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0" Mar 20 19:18:30.084008 containerd[1481]: 2025-03-20 19:18:30.059 [INFO][5391] cni-plugin/dataplane_linux.go 
508: Disabling IPv4 forwarding ContainerID="ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" Namespace="calico-system" Pod="calico-kube-controllers-f9fdd584f-m7mq7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0" Mar 20 19:18:30.084008 containerd[1481]: 2025-03-20 19:18:30.060 [INFO][5391] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" Namespace="calico-system" Pod="calico-kube-controllers-f9fdd584f-m7mq7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0", GenerateName:"calico-kube-controllers-f9fdd584f-", Namespace:"calico-system", SelfLink:"", UID:"3396610e-b805-4c60-9fbd-43cfd2c867c7", ResourceVersion:"1070", Generation:0, CreationTimestamp:time.Date(2025, time.March, 20, 19, 18, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"f9fdd584f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-1-1-f6fba67404.novalocal", ContainerID:"ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c", Pod:"calico-kube-controllers-f9fdd584f-m7mq7", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.96.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6fdffd26a7f", MAC:"1a:8c:5d:bc:8e:7c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 20 19:18:30.084008 containerd[1481]: 2025-03-20 19:18:30.080 [INFO][5391] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" Namespace="calico-system" Pod="calico-kube-controllers-f9fdd584f-m7mq7" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--f9fdd584f--m7mq7-eth0" Mar 20 19:18:30.117996 containerd[1481]: time="2025-03-20T19:18:30.117954771Z" level=info msg="connecting to shim ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c" address="unix:///run/containerd/s/6cdd9b22670c9d1f6e251918abcceccb4e712817d8e3579b5ee2b30f72499c68" namespace=k8s.io protocol=ttrpc version=3 Mar 20 19:18:30.157507 systemd[1]: Started cri-containerd-ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c.scope - libcontainer container ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c. 
Mar 20 19:18:30.216184 containerd[1481]: time="2025-03-20T19:18:30.216132406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-f9fdd584f-m7mq7,Uid:3396610e-b805-4c60-9fbd-43cfd2c867c7,Namespace:calico-system,Attempt:0,} returns sandbox id \"ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c\"" Mar 20 19:18:30.235560 containerd[1481]: time="2025-03-20T19:18:30.234857867Z" level=info msg="CreateContainer within sandbox \"ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 20 19:18:30.249114 containerd[1481]: time="2025-03-20T19:18:30.249053810Z" level=info msg="Container 7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959: CDI devices from CRI Config.CDIDevices: []" Mar 20 19:18:30.259629 containerd[1481]: time="2025-03-20T19:18:30.259592663Z" level=info msg="CreateContainer within sandbox \"ce0b802d5c882423eb819828c391a8f8f9afa9d2db0b0c2d6f70906932239a3c\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\"" Mar 20 19:18:30.260389 containerd[1481]: time="2025-03-20T19:18:30.260229557Z" level=info msg="StartContainer for \"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\"" Mar 20 19:18:30.261980 containerd[1481]: time="2025-03-20T19:18:30.261948665Z" level=info msg="connecting to shim 7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959" address="unix:///run/containerd/s/6cdd9b22670c9d1f6e251918abcceccb4e712817d8e3579b5ee2b30f72499c68" protocol=ttrpc version=3 Mar 20 19:18:30.283564 systemd[1]: Started cri-containerd-7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959.scope - libcontainer container 7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959. 
Mar 20 19:18:30.366705 containerd[1481]: time="2025-03-20T19:18:30.366673110Z" level=info msg="StartContainer for \"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" returns successfully" Mar 20 19:18:31.354786 kubelet[2812]: I0320 19:18:31.354727 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-79jlm" podStartSLOduration=5.354712435 podStartE2EDuration="5.354712435s" podCreationTimestamp="2025-03-20 19:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 19:18:30.324514141 +0000 UTC m=+82.719336353" watchObservedRunningTime="2025-03-20 19:18:31.354712435 +0000 UTC m=+83.749534637" Mar 20 19:18:31.355189 kubelet[2812]: I0320 19:18:31.355148 2812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-f9fdd584f-m7mq7" podStartSLOduration=4.355142688 podStartE2EDuration="4.355142688s" podCreationTimestamp="2025-03-20 19:18:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-20 19:18:31.354659605 +0000 UTC m=+83.749481807" watchObservedRunningTime="2025-03-20 19:18:31.355142688 +0000 UTC m=+83.749964900" Mar 20 19:18:31.465185 containerd[1481]: time="2025-03-20T19:18:31.464760119Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" id:\"2cb7abf58565c24c7a532665c34a404165f3a778cc91abf0787525b0b8ea5909\" pid:5636 exit_status:1 exited_at:{seconds:1742498311 nanos:464009860}" Mar 20 19:18:31.901510 systemd-networkd[1393]: cali6fdffd26a7f: Gained IPv6LL Mar 20 19:18:32.367842 containerd[1481]: time="2025-03-20T19:18:32.367757941Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" 
id:\"5a9f398a331e9c2e678506f0b71603cd4e97c720ceb8b4ec92317789b858a97c\" pid:5726 exit_status:1 exited_at:{seconds:1742498312 nanos:367227738}" Mar 20 19:18:38.530227 update_engine[1469]: I20250320 19:18:38.529479 1469 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 20 19:18:38.530227 update_engine[1469]: I20250320 19:18:38.529906 1469 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 20 19:18:38.531772 update_engine[1469]: I20250320 19:18:38.531659 1469 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 20 19:18:38.537468 update_engine[1469]: E20250320 19:18:38.537038 1469 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 20 19:18:38.537468 update_engine[1469]: I20250320 19:18:38.537169 1469 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 20 19:18:38.537468 update_engine[1469]: I20250320 19:18:38.537190 1469 omaha_request_action.cc:617] Omaha request response: Mar 20 19:18:38.537468 update_engine[1469]: E20250320 19:18:38.537309 1469 omaha_request_action.cc:636] Omaha request network transfer failed. Mar 20 19:18:38.537940 update_engine[1469]: I20250320 19:18:38.537495 1469 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Mar 20 19:18:38.537940 update_engine[1469]: I20250320 19:18:38.537528 1469 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 20 19:18:38.537940 update_engine[1469]: I20250320 19:18:38.537540 1469 update_attempter.cc:306] Processing Done. Mar 20 19:18:38.537940 update_engine[1469]: E20250320 19:18:38.537562 1469 update_attempter.cc:619] Update failed. 
Mar 20 19:18:38.537940 update_engine[1469]: I20250320 19:18:38.537576 1469 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Mar 20 19:18:38.537940 update_engine[1469]: I20250320 19:18:38.537588 1469 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Mar 20 19:18:38.537940 update_engine[1469]: I20250320 19:18:38.537599 1469 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Mar 20 19:18:38.537940 update_engine[1469]: I20250320 19:18:38.537737 1469 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 20 19:18:38.537940 update_engine[1469]: I20250320 19:18:38.537782 1469 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 20 19:18:38.537940 update_engine[1469]: I20250320 19:18:38.537794 1469 omaha_request_action.cc:272] Request: Mar 20 19:18:38.537940 update_engine[1469]: Mar 20 19:18:38.537940 update_engine[1469]: Mar 20 19:18:38.537940 update_engine[1469]: Mar 20 19:18:38.537940 update_engine[1469]: Mar 20 19:18:38.537940 update_engine[1469]: Mar 20 19:18:38.537940 update_engine[1469]: Mar 20 19:18:38.537940 update_engine[1469]: I20250320 19:18:38.537807 1469 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 20 19:18:38.539684 update_engine[1469]: I20250320 19:18:38.538195 1469 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 20 19:18:38.539684 update_engine[1469]: I20250320 19:18:38.538604 1469 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Mar 20 19:18:38.539902 locksmithd[1488]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Mar 20 19:18:38.543969 update_engine[1469]: E20250320 19:18:38.543807 1469 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 20 19:18:38.544168 update_engine[1469]: I20250320 19:18:38.544060 1469 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 20 19:18:38.544168 update_engine[1469]: I20250320 19:18:38.544088 1469 omaha_request_action.cc:617] Omaha request response: Mar 20 19:18:38.544168 update_engine[1469]: I20250320 19:18:38.544101 1469 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 20 19:18:38.544168 update_engine[1469]: I20250320 19:18:38.544113 1469 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 20 19:18:38.544168 update_engine[1469]: I20250320 19:18:38.544126 1469 update_attempter.cc:306] Processing Done. Mar 20 19:18:38.544168 update_engine[1469]: I20250320 19:18:38.544138 1469 update_attempter.cc:310] Error event sent. 
Mar 20 19:18:38.544168 update_engine[1469]: I20250320 19:18:38.544159 1469 update_check_scheduler.cc:74] Next update check in 43m9s Mar 20 19:18:38.545229 locksmithd[1488]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Mar 20 19:18:56.624418 containerd[1481]: time="2025-03-20T19:18:56.624202132Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c\" id:\"03c49b684baf8da179cd8224d7ad6c85298274c0d179d9b7d59a557c302464e4\" pid:5777 exited_at:{seconds:1742498336 nanos:623724046}" Mar 20 19:18:56.736485 containerd[1481]: time="2025-03-20T19:18:56.735129539Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c\" id:\"c6db6d23be89ffa440d5fd347b4538738f6dcb9fc0be184c14d2c9de24070b80\" pid:5802 exited_at:{seconds:1742498336 nanos:734772622}" Mar 20 19:18:59.989742 containerd[1481]: time="2025-03-20T19:18:59.989705864Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" id:\"ca4b880f72c181cfa46b74314e050eede3c79b556b2b62968ff6f0d45ec0203d\" pid:5827 exited_at:{seconds:1742498339 nanos:988283755}" Mar 20 19:19:07.697793 kubelet[2812]: I0320 19:19:07.697640 2812 scope.go:117] "RemoveContainer" containerID="8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c" Mar 20 19:19:07.702907 containerd[1481]: time="2025-03-20T19:19:07.702821197Z" level=info msg="RemoveContainer for \"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\"" Mar 20 19:19:07.719861 containerd[1481]: time="2025-03-20T19:19:07.719568554Z" level=info msg="RemoveContainer for \"8259741d24cd6be0234bb37f0c5790fa0cb123029b1f7e69e5758dd84786296c\" returns successfully" Mar 20 19:19:07.724186 containerd[1481]: time="2025-03-20T19:19:07.724124409Z" level=info msg="StopPodSandbox for 
\"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\"" Mar 20 19:19:07.724959 containerd[1481]: time="2025-03-20T19:19:07.724693130Z" level=info msg="TearDown network for sandbox \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\" successfully" Mar 20 19:19:07.724959 containerd[1481]: time="2025-03-20T19:19:07.724833396Z" level=info msg="StopPodSandbox for \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\" returns successfully" Mar 20 19:19:07.726258 containerd[1481]: time="2025-03-20T19:19:07.726199481Z" level=info msg="RemovePodSandbox for \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\"" Mar 20 19:19:07.726451 containerd[1481]: time="2025-03-20T19:19:07.726265797Z" level=info msg="Forcibly stopping sandbox \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\"" Mar 20 19:19:07.726880 containerd[1481]: time="2025-03-20T19:19:07.726551169Z" level=info msg="TearDown network for sandbox \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\" successfully" Mar 20 19:19:07.730161 containerd[1481]: time="2025-03-20T19:19:07.730088730Z" level=info msg="Ensure that sandbox 9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7 in task-service has been cleanup successfully" Mar 20 19:19:07.739628 containerd[1481]: time="2025-03-20T19:19:07.739462889Z" level=info msg="RemovePodSandbox \"9d4f1cf778b76894b0fd6a4599fa7e2537658bf00d80801825d43f139c3a59f7\" returns successfully" Mar 20 19:19:07.740642 containerd[1481]: time="2025-03-20T19:19:07.740340517Z" level=info msg="StopPodSandbox for \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\"" Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.819 [WARNING][5851] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" 
WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.819 [INFO][5851] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.819 [INFO][5851] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" iface="eth0" netns="" Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.819 [INFO][5851] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.819 [INFO][5851] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.857 [INFO][5858] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.857 [INFO][5858] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.857 [INFO][5858] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.868 [WARNING][5858] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.868 [INFO][5858] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.871 [INFO][5858] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 19:19:07.873743 containerd[1481]: 2025-03-20 19:19:07.872 [INFO][5851] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:19:07.875327 containerd[1481]: time="2025-03-20T19:19:07.873782489Z" level=info msg="TearDown network for sandbox \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" successfully" Mar 20 19:19:07.875327 containerd[1481]: time="2025-03-20T19:19:07.873809962Z" level=info msg="StopPodSandbox for \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" returns successfully" Mar 20 19:19:07.875327 containerd[1481]: time="2025-03-20T19:19:07.874770215Z" level=info msg="RemovePodSandbox for \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\"" Mar 20 19:19:07.875327 containerd[1481]: time="2025-03-20T19:19:07.874815973Z" level=info msg="Forcibly stopping sandbox \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\"" Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.925 [WARNING][5876] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward 
with the clean up ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" WorkloadEndpoint="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.925 [INFO][5876] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.925 [INFO][5876] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" iface="eth0" netns="" Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.925 [INFO][5876] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.925 [INFO][5876] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.946 [INFO][5883] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.946 [INFO][5883] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.946 [INFO][5883] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.954 [WARNING][5883] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.955 [INFO][5883] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" HandleID="k8s-pod-network.afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Workload="ci--9999--0--1--1--f6fba67404.novalocal-k8s-calico--kube--controllers--98754bff4--5m8cp-eth0" Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.957 [INFO][5883] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 20 19:19:07.959395 containerd[1481]: 2025-03-20 19:19:07.958 [INFO][5876] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00" Mar 20 19:19:07.959785 containerd[1481]: time="2025-03-20T19:19:07.959725685Z" level=info msg="TearDown network for sandbox \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" successfully" Mar 20 19:19:07.961613 containerd[1481]: time="2025-03-20T19:19:07.961587531Z" level=info msg="Ensure that sandbox afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00 in task-service has been cleanup successfully" Mar 20 19:19:07.966065 containerd[1481]: time="2025-03-20T19:19:07.966029370Z" level=info msg="RemovePodSandbox \"afc88634433634ddcf57ee1f6936c543f0777a12b5d095d39ded369bd5329e00\" returns successfully" Mar 20 19:19:07.966832 containerd[1481]: time="2025-03-20T19:19:07.966593502Z" level=info msg="StopPodSandbox for \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\"" Mar 20 19:19:07.966832 containerd[1481]: time="2025-03-20T19:19:07.966728939Z" level=info msg="TearDown network for sandbox 
\"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" successfully" Mar 20 19:19:07.966832 containerd[1481]: time="2025-03-20T19:19:07.966743227Z" level=info msg="StopPodSandbox for \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" returns successfully" Mar 20 19:19:07.967533 containerd[1481]: time="2025-03-20T19:19:07.967300625Z" level=info msg="RemovePodSandbox for \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\"" Mar 20 19:19:07.967533 containerd[1481]: time="2025-03-20T19:19:07.967324851Z" level=info msg="Forcibly stopping sandbox \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\"" Mar 20 19:19:07.967533 containerd[1481]: time="2025-03-20T19:19:07.967408721Z" level=info msg="TearDown network for sandbox \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" successfully" Mar 20 19:19:07.968957 containerd[1481]: time="2025-03-20T19:19:07.968933808Z" level=info msg="Ensure that sandbox dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf in task-service has been cleanup successfully" Mar 20 19:19:07.972877 containerd[1481]: time="2025-03-20T19:19:07.972844828Z" level=info msg="RemovePodSandbox \"dba820733cd4ef0e877be12cd2f4e5ff6cadef2bd05b879d7cdcffb8dad605cf\" returns successfully" Mar 20 19:19:26.599249 containerd[1481]: time="2025-03-20T19:19:26.598978585Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c\" id:\"9e688965cfea5cb83309860ee499b05c601fc14f513ed30a3a6266a33d7a0502\" pid:5914 exited_at:{seconds:1742498366 nanos:598580618}" Mar 20 19:19:29.991959 containerd[1481]: time="2025-03-20T19:19:29.991906465Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" id:\"0f4a7825ce80e7b6008feefa89ba1330154556b8c6ca6b38645cf894ef4d9fda\" pid:5953 exited_at:{seconds:1742498369 nanos:991009770}" Mar 20 
19:19:29.994581 containerd[1481]: time="2025-03-20T19:19:29.994504677Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" id:\"cac9471e26b008ec82d601faa85077a9e19bdab9086613ac45183928aba5c4c6\" pid:5952 exited_at:{seconds:1742498369 nanos:994167466}" Mar 20 19:19:38.012851 systemd[1]: Started sshd@9-172.24.4.12:22-172.24.4.1:49490.service - OpenSSH per-connection server daemon (172.24.4.1:49490). Mar 20 19:19:39.337651 sshd[5977]: Accepted publickey for core from 172.24.4.1 port 49490 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ Mar 20 19:19:39.344285 sshd-session[5977]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 19:19:39.358875 systemd-logind[1464]: New session 12 of user core. Mar 20 19:19:39.368701 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 20 19:19:40.220397 sshd[5979]: Connection closed by 172.24.4.1 port 49490 Mar 20 19:19:40.221510 sshd-session[5977]: pam_unix(sshd:session): session closed for user core Mar 20 19:19:40.226397 systemd[1]: sshd@9-172.24.4.12:22-172.24.4.1:49490.service: Deactivated successfully. Mar 20 19:19:40.229284 systemd[1]: session-12.scope: Deactivated successfully. Mar 20 19:19:40.231619 systemd-logind[1464]: Session 12 logged out. Waiting for processes to exit. Mar 20 19:19:40.233141 systemd-logind[1464]: Removed session 12. Mar 20 19:19:45.247882 systemd[1]: Started sshd@10-172.24.4.12:22-172.24.4.1:42428.service - OpenSSH per-connection server daemon (172.24.4.1:42428). Mar 20 19:19:46.532291 sshd[5992]: Accepted publickey for core from 172.24.4.1 port 42428 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ Mar 20 19:19:46.535246 sshd-session[5992]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 19:19:46.549325 systemd-logind[1464]: New session 13 of user core. 
Mar 20 19:19:46.558694 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 20 19:19:47.285640 sshd[5994]: Connection closed by 172.24.4.1 port 42428 Mar 20 19:19:47.286084 sshd-session[5992]: pam_unix(sshd:session): session closed for user core Mar 20 19:19:47.293761 systemd-logind[1464]: Session 13 logged out. Waiting for processes to exit. Mar 20 19:19:47.295118 systemd[1]: sshd@10-172.24.4.12:22-172.24.4.1:42428.service: Deactivated successfully. Mar 20 19:19:47.299236 systemd[1]: session-13.scope: Deactivated successfully. Mar 20 19:19:47.304345 systemd-logind[1464]: Removed session 13. Mar 20 19:19:52.307552 systemd[1]: Started sshd@11-172.24.4.12:22-172.24.4.1:42438.service - OpenSSH per-connection server daemon (172.24.4.1:42438). Mar 20 19:19:53.567183 sshd[6014]: Accepted publickey for core from 172.24.4.1 port 42438 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ Mar 20 19:19:53.569967 sshd-session[6014]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 19:19:53.582521 systemd-logind[1464]: New session 14 of user core. Mar 20 19:19:53.588750 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 20 19:19:54.272496 sshd[6016]: Connection closed by 172.24.4.1 port 42438 Mar 20 19:19:54.271890 sshd-session[6014]: pam_unix(sshd:session): session closed for user core Mar 20 19:19:54.285946 systemd[1]: sshd@11-172.24.4.12:22-172.24.4.1:42438.service: Deactivated successfully. Mar 20 19:19:54.288421 systemd[1]: session-14.scope: Deactivated successfully. Mar 20 19:19:54.289884 systemd-logind[1464]: Session 14 logged out. Waiting for processes to exit. Mar 20 19:19:54.293280 systemd[1]: Started sshd@12-172.24.4.12:22-172.24.4.1:50714.service - OpenSSH per-connection server daemon (172.24.4.1:50714). Mar 20 19:19:54.295029 systemd-logind[1464]: Removed session 14. 
Mar 20 19:19:55.654565 sshd[6030]: Accepted publickey for core from 172.24.4.1 port 50714 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ Mar 20 19:19:55.657324 sshd-session[6030]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 19:19:55.671170 systemd-logind[1464]: New session 15 of user core. Mar 20 19:19:55.681725 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 20 19:19:56.490694 sshd[6033]: Connection closed by 172.24.4.1 port 50714 Mar 20 19:19:56.491017 sshd-session[6030]: pam_unix(sshd:session): session closed for user core Mar 20 19:19:56.503521 systemd[1]: sshd@12-172.24.4.12:22-172.24.4.1:50714.service: Deactivated successfully. Mar 20 19:19:56.508182 systemd[1]: session-15.scope: Deactivated successfully. Mar 20 19:19:56.511849 systemd-logind[1464]: Session 15 logged out. Waiting for processes to exit. Mar 20 19:19:56.517403 systemd[1]: Started sshd@13-172.24.4.12:22-172.24.4.1:50722.service - OpenSSH per-connection server daemon (172.24.4.1:50722). Mar 20 19:19:56.520871 systemd-logind[1464]: Removed session 15. Mar 20 19:19:56.609315 containerd[1481]: time="2025-03-20T19:19:56.609012967Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c\" id:\"2bcc58d1288e294ee7873039fb10daf41a7d4652ecd2a8977f64585a7968378c\" pid:6057 exited_at:{seconds:1742498396 nanos:608677768}" Mar 20 19:19:57.688118 sshd[6042]: Accepted publickey for core from 172.24.4.1 port 50722 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ Mar 20 19:19:57.691002 sshd-session[6042]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 19:19:57.704483 systemd-logind[1464]: New session 16 of user core. Mar 20 19:19:57.713738 systemd[1]: Started session-16.scope - Session 16 of User core. 
Mar 20 19:19:58.525416 sshd[6069]: Connection closed by 172.24.4.1 port 50722 Mar 20 19:19:58.526819 sshd-session[6042]: pam_unix(sshd:session): session closed for user core Mar 20 19:19:58.534694 systemd[1]: sshd@13-172.24.4.12:22-172.24.4.1:50722.service: Deactivated successfully. Mar 20 19:19:58.538295 systemd[1]: session-16.scope: Deactivated successfully. Mar 20 19:19:58.543677 systemd-logind[1464]: Session 16 logged out. Waiting for processes to exit. Mar 20 19:19:58.546179 systemd-logind[1464]: Removed session 16. Mar 20 19:19:59.988993 containerd[1481]: time="2025-03-20T19:19:59.988953053Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" id:\"2fbc8c0c1a8c4522fe9dd14d1ddaa25e2acdb457abb9ec1fcffefd5099046260\" pid:6093 exited_at:{seconds:1742498399 nanos:988602426}" Mar 20 19:20:03.555102 systemd[1]: Started sshd@14-172.24.4.12:22-172.24.4.1:40088.service - OpenSSH per-connection server daemon (172.24.4.1:40088). Mar 20 19:20:04.828232 sshd[6113]: Accepted publickey for core from 172.24.4.1 port 40088 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ Mar 20 19:20:04.831298 sshd-session[6113]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 20 19:20:04.843989 systemd-logind[1464]: New session 17 of user core. Mar 20 19:20:04.851696 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 20 19:20:05.904276 sshd[6115]: Connection closed by 172.24.4.1 port 40088 Mar 20 19:20:05.905171 sshd-session[6113]: pam_unix(sshd:session): session closed for user core Mar 20 19:20:05.913726 systemd[1]: sshd@14-172.24.4.12:22-172.24.4.1:40088.service: Deactivated successfully. Mar 20 19:20:05.919288 systemd[1]: session-17.scope: Deactivated successfully. Mar 20 19:20:05.921821 systemd-logind[1464]: Session 17 logged out. Waiting for processes to exit. Mar 20 19:20:05.924446 systemd-logind[1464]: Removed session 17. 
Mar 20 19:20:10.929628 systemd[1]: Started sshd@15-172.24.4.12:22-172.24.4.1:40104.service - OpenSSH per-connection server daemon (172.24.4.1:40104).
Mar 20 19:20:12.294247 sshd[6140]: Accepted publickey for core from 172.24.4.1 port 40104 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:20:12.297085 sshd-session[6140]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:20:12.309429 systemd-logind[1464]: New session 18 of user core.
Mar 20 19:20:12.315703 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 20 19:20:13.076175 sshd[6142]: Connection closed by 172.24.4.1 port 40104
Mar 20 19:20:13.077858 sshd-session[6140]: pam_unix(sshd:session): session closed for user core
Mar 20 19:20:13.084910 systemd[1]: sshd@15-172.24.4.12:22-172.24.4.1:40104.service: Deactivated successfully.
Mar 20 19:20:13.091191 systemd[1]: session-18.scope: Deactivated successfully.
Mar 20 19:20:13.093296 systemd-logind[1464]: Session 18 logged out. Waiting for processes to exit.
Mar 20 19:20:13.096131 systemd-logind[1464]: Removed session 18.
Mar 20 19:20:18.100895 systemd[1]: Started sshd@16-172.24.4.12:22-172.24.4.1:53218.service - OpenSSH per-connection server daemon (172.24.4.1:53218).
Mar 20 19:20:19.337286 sshd[6153]: Accepted publickey for core from 172.24.4.1 port 53218 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:20:19.340865 sshd-session[6153]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:20:19.355567 systemd-logind[1464]: New session 19 of user core.
Mar 20 19:20:19.363747 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 20 19:20:20.076539 sshd[6155]: Connection closed by 172.24.4.1 port 53218
Mar 20 19:20:20.077569 sshd-session[6153]: pam_unix(sshd:session): session closed for user core
Mar 20 19:20:20.097890 systemd[1]: sshd@16-172.24.4.12:22-172.24.4.1:53218.service: Deactivated successfully.
Mar 20 19:20:20.103579 systemd[1]: session-19.scope: Deactivated successfully.
Mar 20 19:20:20.107296 systemd-logind[1464]: Session 19 logged out. Waiting for processes to exit.
Mar 20 19:20:20.114025 systemd[1]: Started sshd@17-172.24.4.12:22-172.24.4.1:53226.service - OpenSSH per-connection server daemon (172.24.4.1:53226).
Mar 20 19:20:20.121564 systemd-logind[1464]: Removed session 19.
Mar 20 19:20:21.370269 sshd[6165]: Accepted publickey for core from 172.24.4.1 port 53226 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:20:21.372650 sshd-session[6165]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:20:21.384945 systemd-logind[1464]: New session 20 of user core.
Mar 20 19:20:21.391672 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 20 19:20:22.565894 sshd[6168]: Connection closed by 172.24.4.1 port 53226
Mar 20 19:20:22.567648 sshd-session[6165]: pam_unix(sshd:session): session closed for user core
Mar 20 19:20:22.580267 systemd[1]: sshd@17-172.24.4.12:22-172.24.4.1:53226.service: Deactivated successfully.
Mar 20 19:20:22.584025 systemd[1]: session-20.scope: Deactivated successfully.
Mar 20 19:20:22.588223 systemd-logind[1464]: Session 20 logged out. Waiting for processes to exit.
Mar 20 19:20:22.592002 systemd[1]: Started sshd@18-172.24.4.12:22-172.24.4.1:53236.service - OpenSSH per-connection server daemon (172.24.4.1:53236).
Mar 20 19:20:22.597194 systemd-logind[1464]: Removed session 20.
Mar 20 19:20:23.839414 sshd[6177]: Accepted publickey for core from 172.24.4.1 port 53236 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:20:23.842714 sshd-session[6177]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:20:23.856074 systemd-logind[1464]: New session 21 of user core.
Mar 20 19:20:23.864634 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 20 19:20:26.618538 containerd[1481]: time="2025-03-20T19:20:26.618489241Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c\" id:\"bbe82d92edac3ebb9f53ed2ab989928b4e051985b98b184a386dec0e0de74845\" pid:6207 exited_at:{seconds:1742498426 nanos:617458630}"
Mar 20 19:20:26.968842 sshd[6182]: Connection closed by 172.24.4.1 port 53236
Mar 20 19:20:26.970241 sshd-session[6177]: pam_unix(sshd:session): session closed for user core
Mar 20 19:20:26.981566 systemd[1]: Started sshd@19-172.24.4.12:22-172.24.4.1:45118.service - OpenSSH per-connection server daemon (172.24.4.1:45118).
Mar 20 19:20:26.982030 systemd[1]: sshd@18-172.24.4.12:22-172.24.4.1:53236.service: Deactivated successfully.
Mar 20 19:20:26.985025 systemd[1]: session-21.scope: Deactivated successfully.
Mar 20 19:20:26.985344 systemd[1]: session-21.scope: Consumed 790ms CPU time, 70.7M memory peak.
Mar 20 19:20:26.988775 systemd-logind[1464]: Session 21 logged out. Waiting for processes to exit.
Mar 20 19:20:26.990552 systemd-logind[1464]: Removed session 21.
Mar 20 19:20:28.195099 sshd[6224]: Accepted publickey for core from 172.24.4.1 port 45118 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:20:28.198699 sshd-session[6224]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:20:28.211507 systemd-logind[1464]: New session 22 of user core.
Mar 20 19:20:28.217692 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 20 19:20:29.417523 sshd[6229]: Connection closed by 172.24.4.1 port 45118
Mar 20 19:20:29.416755 sshd-session[6224]: pam_unix(sshd:session): session closed for user core
Mar 20 19:20:29.433574 systemd[1]: sshd@19-172.24.4.12:22-172.24.4.1:45118.service: Deactivated successfully.
Mar 20 19:20:29.437818 systemd[1]: session-22.scope: Deactivated successfully.
Mar 20 19:20:29.440977 systemd-logind[1464]: Session 22 logged out. Waiting for processes to exit.
Mar 20 19:20:29.444804 systemd[1]: Started sshd@20-172.24.4.12:22-172.24.4.1:45130.service - OpenSSH per-connection server daemon (172.24.4.1:45130).
Mar 20 19:20:29.448624 systemd-logind[1464]: Removed session 22.
Mar 20 19:20:29.964874 containerd[1481]: time="2025-03-20T19:20:29.964822074Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" id:\"056d213cc4c1702c2435dc52a73c66f26b372f3429c2842b42e58d3a989fb0be\" pid:6261 exited_at:{seconds:1742498429 nanos:964499772}"
Mar 20 19:20:29.972171 containerd[1481]: time="2025-03-20T19:20:29.972133803Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" id:\"b62197e8ce7f60c0ff9ad90830c4fe9783ba140b53eff2d56de2356acc76666a\" pid:6273 exited_at:{seconds:1742498429 nanos:971893525}"
Mar 20 19:20:30.796768 sshd[6238]: Accepted publickey for core from 172.24.4.1 port 45130 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:20:30.798195 sshd-session[6238]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:20:30.807031 systemd-logind[1464]: New session 23 of user core.
Mar 20 19:20:30.815876 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 20 19:20:31.499039 sshd[6285]: Connection closed by 172.24.4.1 port 45130
Mar 20 19:20:31.500217 sshd-session[6238]: pam_unix(sshd:session): session closed for user core
Mar 20 19:20:31.507310 systemd[1]: sshd@20-172.24.4.12:22-172.24.4.1:45130.service: Deactivated successfully.
Mar 20 19:20:31.512012 systemd[1]: session-23.scope: Deactivated successfully.
Mar 20 19:20:31.517688 systemd-logind[1464]: Session 23 logged out. Waiting for processes to exit.
Mar 20 19:20:31.521647 systemd-logind[1464]: Removed session 23.
Mar 20 19:20:36.521162 systemd[1]: Started sshd@21-172.24.4.12:22-172.24.4.1:40586.service - OpenSSH per-connection server daemon (172.24.4.1:40586).
Mar 20 19:20:37.665728 sshd[6300]: Accepted publickey for core from 172.24.4.1 port 40586 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:20:37.668677 sshd-session[6300]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:20:37.681465 systemd-logind[1464]: New session 24 of user core.
Mar 20 19:20:37.687718 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 20 19:20:38.500869 sshd[6302]: Connection closed by 172.24.4.1 port 40586
Mar 20 19:20:38.501960 sshd-session[6300]: pam_unix(sshd:session): session closed for user core
Mar 20 19:20:38.509100 systemd[1]: sshd@21-172.24.4.12:22-172.24.4.1:40586.service: Deactivated successfully.
Mar 20 19:20:38.513649 systemd[1]: session-24.scope: Deactivated successfully.
Mar 20 19:20:38.517673 systemd-logind[1464]: Session 24 logged out. Waiting for processes to exit.
Mar 20 19:20:38.520677 systemd-logind[1464]: Removed session 24.
Mar 20 19:20:43.526262 systemd[1]: Started sshd@22-172.24.4.12:22-172.24.4.1:42590.service - OpenSSH per-connection server daemon (172.24.4.1:42590).
Mar 20 19:20:44.666104 sshd[6314]: Accepted publickey for core from 172.24.4.1 port 42590 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:20:44.668503 sshd-session[6314]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:20:44.679883 systemd-logind[1464]: New session 25 of user core.
Mar 20 19:20:44.691651 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 20 19:20:45.500884 sshd[6316]: Connection closed by 172.24.4.1 port 42590
Mar 20 19:20:45.501711 sshd-session[6314]: pam_unix(sshd:session): session closed for user core
Mar 20 19:20:45.509733 systemd[1]: sshd@22-172.24.4.12:22-172.24.4.1:42590.service: Deactivated successfully.
Mar 20 19:20:45.514876 systemd[1]: session-25.scope: Deactivated successfully.
Mar 20 19:20:45.517714 systemd-logind[1464]: Session 25 logged out. Waiting for processes to exit.
Mar 20 19:20:45.520421 systemd-logind[1464]: Removed session 25.
Mar 20 19:20:50.524857 systemd[1]: Started sshd@23-172.24.4.12:22-172.24.4.1:42602.service - OpenSSH per-connection server daemon (172.24.4.1:42602).
Mar 20 19:20:51.665517 sshd[6328]: Accepted publickey for core from 172.24.4.1 port 42602 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:20:51.668249 sshd-session[6328]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:20:51.681235 systemd-logind[1464]: New session 26 of user core.
Mar 20 19:20:51.691659 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 20 19:20:52.500646 sshd[6330]: Connection closed by 172.24.4.1 port 42602
Mar 20 19:20:52.501969 sshd-session[6328]: pam_unix(sshd:session): session closed for user core
Mar 20 19:20:52.508507 systemd[1]: sshd@23-172.24.4.12:22-172.24.4.1:42602.service: Deactivated successfully.
Mar 20 19:20:52.515317 systemd[1]: session-26.scope: Deactivated successfully.
Mar 20 19:20:52.521976 systemd-logind[1464]: Session 26 logged out. Waiting for processes to exit.
Mar 20 19:20:52.526689 systemd-logind[1464]: Removed session 26.
Mar 20 19:20:56.632266 containerd[1481]: time="2025-03-20T19:20:56.632202964Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c\" id:\"be37781c2db1507f73925749955e52026ce15f0fccdbc5dbc4084ba162197a48\" pid:6356 exited_at:{seconds:1742498456 nanos:631905556}"
Mar 20 19:20:57.534867 systemd[1]: Started sshd@24-172.24.4.12:22-172.24.4.1:43744.service - OpenSSH per-connection server daemon (172.24.4.1:43744).
Mar 20 19:20:58.539623 sshd[6368]: Accepted publickey for core from 172.24.4.1 port 43744 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:20:58.547745 sshd-session[6368]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:20:58.560501 systemd-logind[1464]: New session 27 of user core.
Mar 20 19:20:58.572847 systemd[1]: Started session-27.scope - Session 27 of User core.
Mar 20 19:20:59.544513 sshd[6370]: Connection closed by 172.24.4.1 port 43744
Mar 20 19:20:59.545670 sshd-session[6368]: pam_unix(sshd:session): session closed for user core
Mar 20 19:20:59.554152 systemd[1]: sshd@24-172.24.4.12:22-172.24.4.1:43744.service: Deactivated successfully.
Mar 20 19:20:59.559346 systemd[1]: session-27.scope: Deactivated successfully.
Mar 20 19:20:59.562592 systemd-logind[1464]: Session 27 logged out. Waiting for processes to exit.
Mar 20 19:20:59.567102 systemd-logind[1464]: Removed session 27.
Mar 20 19:20:59.985009 containerd[1481]: time="2025-03-20T19:20:59.984749214Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" id:\"247812b08cc3b33477f59ea2d4603e4e4f4d46e07fe5cfb230a3b430efa6e13c\" pid:6393 exited_at:{seconds:1742498459 nanos:984295241}"
Mar 20 19:21:04.569320 systemd[1]: Started sshd@25-172.24.4.12:22-172.24.4.1:44600.service - OpenSSH per-connection server daemon (172.24.4.1:44600).
Mar 20 19:21:05.705852 sshd[6403]: Accepted publickey for core from 172.24.4.1 port 44600 ssh2: RSA SHA256:IYiaHveyfu43JXceR/qCHbfogqPL8VsPnULJnxdX2UQ
Mar 20 19:21:05.710100 sshd-session[6403]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 20 19:21:05.726716 systemd-logind[1464]: New session 28 of user core.
Mar 20 19:21:05.733164 systemd[1]: Started session-28.scope - Session 28 of User core.
Mar 20 19:21:06.377424 sshd[6406]: Connection closed by 172.24.4.1 port 44600
Mar 20 19:21:06.378990 sshd-session[6403]: pam_unix(sshd:session): session closed for user core
Mar 20 19:21:06.384752 systemd[1]: sshd@25-172.24.4.12:22-172.24.4.1:44600.service: Deactivated successfully.
Mar 20 19:21:06.390416 systemd[1]: session-28.scope: Deactivated successfully.
Mar 20 19:21:06.395176 systemd-logind[1464]: Session 28 logged out. Waiting for processes to exit.
Mar 20 19:21:06.397904 systemd-logind[1464]: Removed session 28.
Mar 20 19:21:26.629054 containerd[1481]: time="2025-03-20T19:21:26.628557242Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c\" id:\"daedd2c22561a66e273f97317c38501c341df7c5cb55e220909b905d738ae156\" pid:6442 exited_at:{seconds:1742498486 nanos:627912567}"
Mar 20 19:21:29.990716 containerd[1481]: time="2025-03-20T19:21:29.990673796Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" id:\"675a34ff541b93309b8e6fbc93d7d2978d2d78fd49a6641850ae8de1a1378acc\" pid:6477 exited_at:{seconds:1742498489 nanos:990461676}"
Mar 20 19:21:29.998472 containerd[1481]: time="2025-03-20T19:21:29.998094834Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" id:\"4fd45d0c2407fda0b6aa6f11eff0feb595fa1f707c11d8c98b8ade4dec6caa4b\" pid:6486 exited_at:{seconds:1742498489 nanos:997791472}"
Mar 20 19:21:56.625667 containerd[1481]: time="2025-03-20T19:21:56.625558306Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5adff4313c722eea368cb95196ccfeef1c10477345e6d5574e3b82ba302d098c\" id:\"7f0203655ed256a934c1212dac7c84fed95fa69c34f6e8f0e66d28af8a20e844\" pid:6533 exited_at:{seconds:1742498516 nanos:624596579}"
Mar 20 19:21:59.993540 containerd[1481]: time="2025-03-20T19:21:59.993376537Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7dccd876d7294d9874eb7d0f86d8caac756480c838e770109fc773f52da53959\" id:\"1d09037ff50b3c915416f83d91741cf1719bbd82f937da38c403b251fcfcb971\" pid:6557 exited_at:{seconds:1742498519 nanos:992569952}"
Mar 20 19:22:02.227149 containerd[1481]: time="2025-03-20T19:22:02.226940689Z" level=warning msg="container event discarded" container=6a4a0a5919b4a95cf80b66d73aea46152e09eedd1d95c46b80aede30dca3158a type=CONTAINER_CREATED_EVENT
Mar 20 19:22:02.238527 containerd[1481]: time="2025-03-20T19:22:02.238328071Z" level=warning msg="container event discarded" container=6a4a0a5919b4a95cf80b66d73aea46152e09eedd1d95c46b80aede30dca3158a type=CONTAINER_STARTED_EVENT
Mar 20 19:22:02.238527 containerd[1481]: time="2025-03-20T19:22:02.238450032Z" level=warning msg="container event discarded" container=177919e0f7e1386ca629b4847a4936869261fee48ffd0713f6d659c34215b8eb type=CONTAINER_CREATED_EVENT
Mar 20 19:22:02.238527 containerd[1481]: time="2025-03-20T19:22:02.238474077Z" level=warning msg="container event discarded" container=177919e0f7e1386ca629b4847a4936869261fee48ffd0713f6d659c34215b8eb type=CONTAINER_STARTED_EVENT
Mar 20 19:22:02.257198 containerd[1481]: time="2025-03-20T19:22:02.257098687Z" level=warning msg="container event discarded" container=2e53644d0b037fcb3c9af1e8659c02cb282e6a8bfcadf9bfb7ec9c705022a334 type=CONTAINER_CREATED_EVENT
Mar 20 19:22:02.257198 containerd[1481]: time="2025-03-20T19:22:02.257179128Z" level=warning msg="container event discarded" container=2e53644d0b037fcb3c9af1e8659c02cb282e6a8bfcadf9bfb7ec9c705022a334 type=CONTAINER_STARTED_EVENT
Mar 20 19:22:02.286012 containerd[1481]: time="2025-03-20T19:22:02.285893147Z" level=warning msg="container event discarded" container=ac9ebf61e868f230579f990a5fe7531dcfeb3f5dd7dbee194d3707c97eba1ae8 type=CONTAINER_CREATED_EVENT
Mar 20 19:22:02.286012 containerd[1481]: time="2025-03-20T19:22:02.285960384Z" level=warning msg="container event discarded" container=151204fc5434340e64ca0ec8a44bc90dd2cffaa5d2dc17db7db7f7a93d525242 type=CONTAINER_CREATED_EVENT
Mar 20 19:22:02.297486 containerd[1481]: time="2025-03-20T19:22:02.297381098Z" level=warning msg="container event discarded" container=0d5167d1744260f4c6a1dc4e65868c0ef627d59856b347ea484d5fd3fc7a7e83 type=CONTAINER_CREATED_EVENT
Mar 20 19:22:02.397268 containerd[1481]: time="2025-03-20T19:22:02.397025160Z" level=warning msg="container event discarded" container=ac9ebf61e868f230579f990a5fe7531dcfeb3f5dd7dbee194d3707c97eba1ae8 type=CONTAINER_STARTED_EVENT
Mar 20 19:22:02.422572 containerd[1481]: time="2025-03-20T19:22:02.422458048Z" level=warning msg="container event discarded" container=151204fc5434340e64ca0ec8a44bc90dd2cffaa5d2dc17db7db7f7a93d525242 type=CONTAINER_STARTED_EVENT
Mar 20 19:22:02.422572 containerd[1481]: time="2025-03-20T19:22:02.422534522Z" level=warning msg="container event discarded" container=0d5167d1744260f4c6a1dc4e65868c0ef627d59856b347ea484d5fd3fc7a7e83 type=CONTAINER_STARTED_EVENT