Mar 21 13:30:42.018311 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Mar 21 10:52:59 -00 2025
Mar 21 13:30:42.018343 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=fb715041d083099c6a15c8aee7cc93fc3f3ca8764fc0aaaff245a06641d663d2
Mar 21 13:30:42.018353 kernel: BIOS-provided physical RAM map:
Mar 21 13:30:42.018361 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 21 13:30:42.018369 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 21 13:30:42.018379 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 21 13:30:42.018388 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable
Mar 21 13:30:42.018396 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved
Mar 21 13:30:42.018403 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 21 13:30:42.018411 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 21 13:30:42.018419 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable
Mar 21 13:30:42.018426 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 21 13:30:42.018434 kernel: NX (Execute Disable) protection: active
Mar 21 13:30:42.018442 kernel: APIC: Static calls initialized
Mar 21 13:30:42.018453 kernel: SMBIOS 3.0.0 present.
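The BIOS-e820 entries above describe the firmware memory map as inclusive `[start-end]` ranges, each tagged `usable` or `reserved`. As a minimal illustrative sketch (the regex and function names here are mine, not part of any kernel tooling), the usable total can be summed directly from lines in this format:

```python
import re

# Matches kernel log lines such as:
#   BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
E820_RE = re.compile(r"BIOS-e820: \[mem 0x([0-9a-f]+)-0x([0-9a-f]+)\] (\S+)")

def usable_bytes(log_lines):
    """Sum the sizes of all e820 ranges marked 'usable'."""
    total = 0
    for line in log_lines:
        m = E820_RE.search(line)
        if m and m.group(3) == "usable":
            start, end = int(m.group(1), 16), int(m.group(2), 16)
            total += end - start + 1  # e820 ranges are inclusive
    return total
```

Note that the usable total from the map is slightly larger than the `Memory: … available` figure printed later, since the kernel subtracts its own reservations from the raw firmware map.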
Mar 21 13:30:42.018461 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014
Mar 21 13:30:42.018469 kernel: Hypervisor detected: KVM
Mar 21 13:30:42.018477 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 21 13:30:42.018485 kernel: kvm-clock: using sched offset of 3655096760 cycles
Mar 21 13:30:42.018494 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 21 13:30:42.018504 kernel: tsc: Detected 1996.249 MHz processor
Mar 21 13:30:42.018513 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 21 13:30:42.018522 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 21 13:30:42.018530 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000
Mar 21 13:30:42.018539 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 21 13:30:42.018547 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 21 13:30:42.018556 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000
Mar 21 13:30:42.018564 kernel: ACPI: Early table checksum verification disabled
Mar 21 13:30:42.018574 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS )
Mar 21 13:30:42.018582 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 21 13:30:42.018590 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 21 13:30:42.018599 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 21 13:30:42.018607 kernel: ACPI: FACS 0x00000000BFFE0000 000040
Mar 21 13:30:42.018615 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 21 13:30:42.018624 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 21 13:30:42.018632 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc]
Mar 21 13:30:42.018640 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48]
Mar 21 13:30:42.018650 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f]
Mar 21 13:30:42.018659 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c]
Mar 21 13:30:42.018667 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64]
Mar 21 13:30:42.018679 kernel: No NUMA configuration found
Mar 21 13:30:42.018687 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff]
Mar 21 13:30:42.018715 kernel: NODE_DATA(0) allocated [mem 0x13fff7000-0x13fffcfff]
Mar 21 13:30:42.018724 kernel: Zone ranges:
Mar 21 13:30:42.018736 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 21 13:30:42.018744 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 21 13:30:42.018753 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff]
Mar 21 13:30:42.018762 kernel: Movable zone start for each node
Mar 21 13:30:42.018770 kernel: Early memory node ranges
Mar 21 13:30:42.018778 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 21 13:30:42.018788 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff]
Mar 21 13:30:42.018796 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff]
Mar 21 13:30:42.018807 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff]
Mar 21 13:30:42.018816 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 21 13:30:42.018824 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 21 13:30:42.018833 kernel: On node 0, zone Normal: 35 pages in unavailable ranges
Mar 21 13:30:42.018841 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 21 13:30:42.018850 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 21 13:30:42.018859 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 21 13:30:42.018867 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 21 13:30:42.018876 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 21 13:30:42.018887 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 21 13:30:42.018895 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 21 13:30:42.018904 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 21 13:30:42.018912 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 21 13:30:42.018921 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Mar 21 13:30:42.018930 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 21 13:30:42.018938 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Mar 21 13:30:42.018947 kernel: Booting paravirtualized kernel on KVM
Mar 21 13:30:42.018955 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 21 13:30:42.018966 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 21 13:30:42.018975 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576
Mar 21 13:30:42.018983 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152
Mar 21 13:30:42.018992 kernel: pcpu-alloc: [0] 0 1
Mar 21 13:30:42.019000 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 21 13:30:42.019010 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=fb715041d083099c6a15c8aee7cc93fc3f3ca8764fc0aaaff245a06641d663d2
Mar 21 13:30:42.019019 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 21 13:30:42.019028 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 21 13:30:42.019039 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 21 13:30:42.019048 kernel: Fallback order for Node 0: 0
Mar 21 13:30:42.019057 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1031901
Mar 21 13:30:42.019065 kernel: Policy zone: Normal
Mar 21 13:30:42.019073 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 21 13:30:42.019082 kernel: software IO TLB: area num 2.
Mar 21 13:30:42.019091 kernel: Memory: 3962108K/4193772K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43588K init, 1476K bss, 231404K reserved, 0K cma-reserved)
Mar 21 13:30:42.019100 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 21 13:30:42.019108 kernel: ftrace: allocating 37985 entries in 149 pages
Mar 21 13:30:42.019119 kernel: ftrace: allocated 149 pages with 4 groups
Mar 21 13:30:42.019127 kernel: Dynamic Preempt: voluntary
Mar 21 13:30:42.019136 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 21 13:30:42.019146 kernel: rcu: RCU event tracing is enabled.
Mar 21 13:30:42.019155 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 21 13:30:42.019164 kernel: Trampoline variant of Tasks RCU enabled.
Mar 21 13:30:42.019173 kernel: Rude variant of Tasks RCU enabled.
Mar 21 13:30:42.019182 kernel: Tracing variant of Tasks RCU enabled.
Mar 21 13:30:42.019190 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 21 13:30:42.019201 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 21 13:30:42.019209 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 21 13:30:42.019218 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 21 13:30:42.019227 kernel: Console: colour VGA+ 80x25
Mar 21 13:30:42.019235 kernel: printk: console [tty0] enabled
Mar 21 13:30:42.019244 kernel: printk: console [ttyS0] enabled
Mar 21 13:30:42.019253 kernel: ACPI: Core revision 20230628
Mar 21 13:30:42.019262 kernel: APIC: Switch to symmetric I/O mode setup
Mar 21 13:30:42.019270 kernel: x2apic enabled
Mar 21 13:30:42.019279 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 21 13:30:42.019289 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 21 13:30:42.019298 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Mar 21 13:30:42.019307 kernel: Calibrating delay loop (skipped) preset value.. 3992.49 BogoMIPS (lpj=1996249)
Mar 21 13:30:42.019315 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Mar 21 13:30:42.019324 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Mar 21 13:30:42.019333 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 21 13:30:42.019342 kernel: Spectre V2 : Mitigation: Retpolines
Mar 21 13:30:42.019350 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 21 13:30:42.019361 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Mar 21 13:30:42.019369 kernel: Speculative Store Bypass: Vulnerable
Mar 21 13:30:42.019378 kernel: x86/fpu: x87 FPU will use FXSAVE
Mar 21 13:30:42.019387 kernel: Freeing SMP alternatives memory: 32K
Mar 21 13:30:42.019401 kernel: pid_max: default: 32768 minimum: 301
Mar 21 13:30:42.019412 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 21 13:30:42.019421 kernel: landlock: Up and running.
Mar 21 13:30:42.019430 kernel: SELinux: Initializing.
Mar 21 13:30:42.019439 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 21 13:30:42.019448 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 21 13:30:42.019457 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3)
Mar 21 13:30:42.019467 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 21 13:30:42.019478 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 21 13:30:42.019487 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 21 13:30:42.019496 kernel: Performance Events: AMD PMU driver.
Mar 21 13:30:42.019505 kernel: ... version: 0
Mar 21 13:30:42.019514 kernel: ... bit width: 48
Mar 21 13:30:42.019525 kernel: ... generic registers: 4
Mar 21 13:30:42.019534 kernel: ... value mask: 0000ffffffffffff
Mar 21 13:30:42.019543 kernel: ... max period: 00007fffffffffff
Mar 21 13:30:42.019552 kernel: ... fixed-purpose events: 0
Mar 21 13:30:42.019561 kernel: ... event mask: 000000000000000f
Mar 21 13:30:42.019570 kernel: signal: max sigframe size: 1440
Mar 21 13:30:42.019579 kernel: rcu: Hierarchical SRCU implementation.
Mar 21 13:30:42.019588 kernel: rcu: Max phase no-delay instances is 400.
Mar 21 13:30:42.019597 kernel: smp: Bringing up secondary CPUs ...
Mar 21 13:30:42.019608 kernel: smpboot: x86: Booting SMP configuration:
Mar 21 13:30:42.019617 kernel: .... node #0, CPUs: #1
Mar 21 13:30:42.019626 kernel: smp: Brought up 1 node, 2 CPUs
Mar 21 13:30:42.019635 kernel: smpboot: Max logical packages: 2
Mar 21 13:30:42.019644 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS)
Mar 21 13:30:42.019653 kernel: devtmpfs: initialized
Mar 21 13:30:42.019662 kernel: x86/mm: Memory block size: 128MB
Mar 21 13:30:42.019671 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 21 13:30:42.019680 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 21 13:30:42.021677 kernel: pinctrl core: initialized pinctrl subsystem
Mar 21 13:30:42.021706 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 21 13:30:42.021716 kernel: audit: initializing netlink subsys (disabled)
Mar 21 13:30:42.021726 kernel: audit: type=2000 audit(1742563841.369:1): state=initialized audit_enabled=0 res=1
Mar 21 13:30:42.021735 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 21 13:30:42.021744 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 21 13:30:42.021754 kernel: cpuidle: using governor menu
Mar 21 13:30:42.021763 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 21 13:30:42.021772 kernel: dca service started, version 1.12.1
Mar 21 13:30:42.021786 kernel: PCI: Using configuration type 1 for base access
Mar 21 13:30:42.021795 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 21 13:30:42.021804 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 21 13:30:42.021814 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 21 13:30:42.021823 kernel: ACPI: Added _OSI(Module Device)
Mar 21 13:30:42.021832 kernel: ACPI: Added _OSI(Processor Device)
Mar 21 13:30:42.021841 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 21 13:30:42.021850 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 21 13:30:42.021859 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 21 13:30:42.021870 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 21 13:30:42.021880 kernel: ACPI: Interpreter enabled
Mar 21 13:30:42.021889 kernel: ACPI: PM: (supports S0 S3 S5)
Mar 21 13:30:42.021898 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 21 13:30:42.021907 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 21 13:30:42.021916 kernel: PCI: Using E820 reservations for host bridge windows
Mar 21 13:30:42.021925 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Mar 21 13:30:42.021935 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 21 13:30:42.022094 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Mar 21 13:30:42.022196 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Mar 21 13:30:42.022288 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Mar 21 13:30:42.022303 kernel: acpiphp: Slot [3] registered
Mar 21 13:30:42.022312 kernel: acpiphp: Slot [4] registered
Mar 21 13:30:42.022321 kernel: acpiphp: Slot [5] registered
Mar 21 13:30:42.022330 kernel: acpiphp: Slot [6] registered
Mar 21 13:30:42.022339 kernel: acpiphp: Slot [7] registered
Mar 21 13:30:42.022348 kernel: acpiphp: Slot [8] registered
Mar 21 13:30:42.022361 kernel: acpiphp: Slot [9] registered
Mar 21 13:30:42.022370 kernel: acpiphp: Slot [10] registered
Mar 21 13:30:42.022378 kernel: acpiphp: Slot [11] registered
Mar 21 13:30:42.022387 kernel: acpiphp: Slot [12] registered
Mar 21 13:30:42.022396 kernel: acpiphp: Slot [13] registered
Mar 21 13:30:42.022405 kernel: acpiphp: Slot [14] registered
Mar 21 13:30:42.022414 kernel: acpiphp: Slot [15] registered
Mar 21 13:30:42.022423 kernel: acpiphp: Slot [16] registered
Mar 21 13:30:42.022432 kernel: acpiphp: Slot [17] registered
Mar 21 13:30:42.022443 kernel: acpiphp: Slot [18] registered
Mar 21 13:30:42.022452 kernel: acpiphp: Slot [19] registered
Mar 21 13:30:42.022460 kernel: acpiphp: Slot [20] registered
Mar 21 13:30:42.022470 kernel: acpiphp: Slot [21] registered
Mar 21 13:30:42.022479 kernel: acpiphp: Slot [22] registered
Mar 21 13:30:42.022487 kernel: acpiphp: Slot [23] registered
Mar 21 13:30:42.022496 kernel: acpiphp: Slot [24] registered
Mar 21 13:30:42.022506 kernel: acpiphp: Slot [25] registered
Mar 21 13:30:42.022514 kernel: acpiphp: Slot [26] registered
Mar 21 13:30:42.022523 kernel: acpiphp: Slot [27] registered
Mar 21 13:30:42.022534 kernel: acpiphp: Slot [28] registered
Mar 21 13:30:42.022543 kernel: acpiphp: Slot [29] registered
Mar 21 13:30:42.022552 kernel: acpiphp: Slot [30] registered
Mar 21 13:30:42.022561 kernel: acpiphp: Slot [31] registered
Mar 21 13:30:42.022570 kernel: PCI host bridge to bus 0000:00
Mar 21 13:30:42.022666 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 21 13:30:42.022772 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 21 13:30:42.022856 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 21 13:30:42.022943 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 21 13:30:42.023026 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window]
Mar 21 13:30:42.023107 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 21 13:30:42.023221 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Mar 21 13:30:42.023330 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Mar 21 13:30:42.023434 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Mar 21 13:30:42.023532 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f]
Mar 21 13:30:42.023626 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Mar 21 13:30:42.025792 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Mar 21 13:30:42.025904 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Mar 21 13:30:42.026010 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Mar 21 13:30:42.026112 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Mar 21 13:30:42.026208 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Mar 21 13:30:42.026307 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Mar 21 13:30:42.026410 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Mar 21 13:30:42.026506 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Mar 21 13:30:42.026599 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref]
Mar 21 13:30:42.027158 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Mar 21 13:30:42.028814 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Mar 21 13:30:42.028919 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 21 13:30:42.029035 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Mar 21 13:30:42.029131 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Mar 21 13:30:42.029225 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Mar 21 13:30:42.029318 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref]
Mar 21 13:30:42.029410 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Mar 21 13:30:42.029511 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Mar 21 13:30:42.029611 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Mar 21 13:30:42.029724 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Mar 21 13:30:42.029819 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref]
Mar 21 13:30:42.029920 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Mar 21 13:30:42.030015 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Mar 21 13:30:42.030108 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref]
Mar 21 13:30:42.030208 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Mar 21 13:30:42.030307 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f]
Mar 21 13:30:42.030400 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff]
Mar 21 13:30:42.030492 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref]
Mar 21 13:30:42.030506 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 21 13:30:42.030516 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 21 13:30:42.030526 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 21 13:30:42.030535 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 21 13:30:42.030544 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Mar 21 13:30:42.030557 kernel: iommu: Default domain type: Translated
Mar 21 13:30:42.030567 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 21 13:30:42.030576 kernel: PCI: Using ACPI for IRQ routing
Mar 21 13:30:42.030585 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 21 13:30:42.030594 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 21 13:30:42.030604 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff]
Mar 21 13:30:42.032728 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Mar 21 13:30:42.032892 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Mar 21 13:30:42.033046 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 21 13:30:42.033066 kernel: vgaarb: loaded
Mar 21 13:30:42.033077 kernel: clocksource: Switched to clocksource kvm-clock
Mar 21 13:30:42.033086 kernel: VFS: Disk quotas dquot_6.6.0
Mar 21 13:30:42.033096 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 21 13:30:42.033106 kernel: pnp: PnP ACPI init
Mar 21 13:30:42.033211 kernel: pnp 00:03: [dma 2]
Mar 21 13:30:42.033227 kernel: pnp: PnP ACPI: found 5 devices
Mar 21 13:30:42.033237 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 21 13:30:42.033249 kernel: NET: Registered PF_INET protocol family
Mar 21 13:30:42.033259 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 21 13:30:42.033268 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 21 13:30:42.033278 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 21 13:30:42.033287 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 21 13:30:42.033297 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 21 13:30:42.033306 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 21 13:30:42.033316 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 21 13:30:42.033325 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 21 13:30:42.033337 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 21 13:30:42.033346 kernel: NET: Registered PF_XDP protocol family
Mar 21 13:30:42.033432 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 21 13:30:42.033516 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 21 13:30:42.033601 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 21 13:30:42.034106 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Mar 21 13:30:42.034263 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window]
Mar 21 13:30:42.034389 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Mar 21 13:30:42.034502 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Mar 21 13:30:42.034518 kernel: PCI: CLS 0 bytes, default 64
Mar 21 13:30:42.034528 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Mar 21 13:30:42.034539 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB)
Mar 21 13:30:42.034549 kernel: Initialise system trusted keyrings
Mar 21 13:30:42.034559 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 21 13:30:42.034569 kernel: Key type asymmetric registered
Mar 21 13:30:42.034579 kernel: Asymmetric key parser 'x509' registered
Mar 21 13:30:42.034589 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 21 13:30:42.034603 kernel: io scheduler mq-deadline registered
Mar 21 13:30:42.034613 kernel: io scheduler kyber registered
Mar 21 13:30:42.034623 kernel: io scheduler bfq registered
Mar 21 13:30:42.034633 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 21 13:30:42.034644 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Mar 21 13:30:42.034654 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Mar 21 13:30:42.034664 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Mar 21 13:30:42.034674 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Mar 21 13:30:42.034685 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 21 13:30:42.034714 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 21 13:30:42.034724 kernel: random: crng init done
Mar 21 13:30:42.034734 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Mar 21 13:30:42.034744 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Mar 21 13:30:42.034754 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Mar 21 13:30:42.034996 kernel: rtc_cmos 00:04: RTC can wake from S4
Mar 21 13:30:42.035015 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Mar 21 13:30:42.035105 kernel: rtc_cmos 00:04: registered as rtc0
Mar 21 13:30:42.035203 kernel: rtc_cmos 00:04: setting system clock to 2025-03-21T13:30:41 UTC (1742563841)
Mar 21 13:30:42.035293 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Mar 21 13:30:42.035308 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Mar 21 13:30:42.035319 kernel: NET: Registered PF_INET6 protocol family
Mar 21 13:30:42.035329 kernel: Segment Routing with IPv6
Mar 21 13:30:42.035338 kernel: In-situ OAM (IOAM) with IPv6
Mar 21 13:30:42.035348 kernel: NET: Registered PF_PACKET protocol family
Mar 21 13:30:42.035358 kernel: Key type dns_resolver registered
Mar 21 13:30:42.035369 kernel: IPI shorthand broadcast: enabled
Mar 21 13:30:42.035382 kernel: sched_clock: Marking stable (957007641, 186251312)->(1178574866, -35315913)
Mar 21 13:30:42.035392 kernel: registered taskstats version 1
Mar 21 13:30:42.035402 kernel: Loading compiled-in X.509 certificates
Mar 21 13:30:42.035412 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: d76f2258ffed89096a9428010e5ac0a0babcea9e'
Mar 21 13:30:42.035423 kernel: Key type .fscrypt registered
Mar 21 13:30:42.035433 kernel: Key type fscrypt-provisioning registered
Mar 21 13:30:42.035442 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 21 13:30:42.035452 kernel: ima: Allocated hash algorithm: sha1
Mar 21 13:30:42.035464 kernel: ima: No architecture policies found
Mar 21 13:30:42.035474 kernel: clk: Disabling unused clocks
Mar 21 13:30:42.035484 kernel: Freeing unused kernel image (initmem) memory: 43588K
Mar 21 13:30:42.035494 kernel: Write protecting the kernel read-only data: 40960k
Mar 21 13:30:42.035504 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K
Mar 21 13:30:42.035514 kernel: Run /init as init process
Mar 21 13:30:42.035524 kernel: with arguments:
Mar 21 13:30:42.035534 kernel: /init
Mar 21 13:30:42.035544 kernel: with environment:
Mar 21 13:30:42.035553 kernel: HOME=/
Mar 21 13:30:42.035565 kernel: TERM=linux
Mar 21 13:30:42.035574 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 21 13:30:42.035586 systemd[1]: Successfully made /usr/ read-only.
Mar 21 13:30:42.035601 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 21 13:30:42.035612 systemd[1]: Detected virtualization kvm.
Mar 21 13:30:42.035623 systemd[1]: Detected architecture x86-64.
Mar 21 13:30:42.035633 systemd[1]: Running in initrd.
Mar 21 13:30:42.035646 systemd[1]: No hostname configured, using default hostname.
Mar 21 13:30:42.035657 systemd[1]: Hostname set to .
Mar 21 13:30:42.035668 systemd[1]: Initializing machine ID from VM UUID.
Mar 21 13:30:42.035678 systemd[1]: Queued start job for default target initrd.target.
Mar 21 13:30:42.035689 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 21 13:30:42.037733 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 21 13:30:42.037756 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 21 13:30:42.037769 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 21 13:30:42.037780 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 21 13:30:42.037792 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 21 13:30:42.037805 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 21 13:30:42.037816 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 21 13:30:42.037829 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 21 13:30:42.037839 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 21 13:30:42.037850 systemd[1]: Reached target paths.target - Path Units.
Mar 21 13:30:42.037861 systemd[1]: Reached target slices.target - Slice Units.
Mar 21 13:30:42.037871 systemd[1]: Reached target swap.target - Swaps.
Mar 21 13:30:42.037881 systemd[1]: Reached target timers.target - Timer Units.
Mar 21 13:30:42.037891 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 21 13:30:42.037902 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 21 13:30:42.037912 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 21 13:30:42.037925 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 21 13:30:42.037935 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 21 13:30:42.037946 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 21 13:30:42.037956 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
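The device unit names in the entries above (e.g. `dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device` for `/dev/disk/by-label/EFI-SYSTEM`) come from systemd's path escaping: path separators become `-` and other special bytes become `\xNN` hex escapes. A simplified sketch of that mapping follows; the function names are mine, and real `systemd-escape` additionally special-cases the root path and a leading `.`:

```python
def escape_path_component(s: str) -> str:
    """Hex-escape bytes outside systemd's safe set [A-Za-z0-9:_.]."""
    out = []
    for ch in s:
        if ch.isascii() and (ch.isalnum() or ch in ":_."):
            out.append(ch)
        else:
            out.append("\\x%02x" % ord(ch))  # e.g. '-' -> '\x2d'
    return "".join(out)

def path_to_unit_name(path: str, suffix: str = ".device") -> str:
    """Turn a filesystem path into a systemd device-unit name (simplified)."""
    parts = [p for p in path.split("/") if p]
    return "-".join(escape_path_component(p) for p in parts) + suffix
```

For example, `path_to_unit_name("/dev/mapper/usr")` yields `dev-mapper-usr.device`, matching the unit systemd waits for above.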
Mar 21 13:30:42.037967 systemd[1]: Reached target sockets.target - Socket Units.
Mar 21 13:30:42.037977 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 21 13:30:42.037988 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 21 13:30:42.037998 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 21 13:30:42.038010 systemd[1]: Starting systemd-fsck-usr.service...
Mar 21 13:30:42.038020 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 21 13:30:42.038031 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 21 13:30:42.038041 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 21 13:30:42.038051 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 21 13:30:42.038061 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 21 13:30:42.038074 systemd[1]: Finished systemd-fsck-usr.service.
Mar 21 13:30:42.038085 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 21 13:30:42.038098 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 21 13:30:42.038108 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 21 13:30:42.038145 systemd-journald[183]: Collecting audit messages is disabled.
Mar 21 13:30:42.038171 systemd-journald[183]: Journal started
Mar 21 13:30:42.038197 systemd-journald[183]: Runtime Journal (/run/log/journal/70ae9356508a4912891abb3abbdadb00) is 8M, max 78.2M, 70.2M free.
Mar 21 13:30:42.023742 systemd-modules-load[186]: Inserted module 'overlay'
Mar 21 13:30:42.063837 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 21 13:30:42.065944 kernel: Bridge firewalling registered
Mar 21 13:30:42.065239 systemd-modules-load[186]: Inserted module 'br_netfilter'
Mar 21 13:30:42.068246 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 21 13:30:42.070765 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 21 13:30:42.071485 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 21 13:30:42.073333 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 21 13:30:42.077522 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 21 13:30:42.078837 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 21 13:30:42.083050 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 21 13:30:42.103148 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 21 13:30:42.108918 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 21 13:30:42.112812 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 21 13:30:42.117844 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 21 13:30:42.125799 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 21 13:30:42.155149 dracut-cmdline[222]: dracut-dracut-053
Mar 21 13:30:42.160607 dracut-cmdline[222]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=fb715041d083099c6a15c8aee7cc93fc3f3ca8764fc0aaaff245a06641d663d2
Mar 21 13:30:42.163005 systemd-resolved[218]: Positive Trust Anchors:
Mar 21 13:30:42.163016 systemd-resolved[218]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 21 13:30:42.163061 systemd-resolved[218]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 21 13:30:42.167415 systemd-resolved[218]: Defaulting to hostname 'linux'.
Mar 21 13:30:42.168423 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 21 13:30:42.168979 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 21 13:30:42.248790 kernel: SCSI subsystem initialized
Mar 21 13:30:42.259754 kernel: Loading iSCSI transport class v2.0-870.
Mar 21 13:30:42.271771 kernel: iscsi: registered transport (tcp)
Mar 21 13:30:42.294042 kernel: iscsi: registered transport (qla4xxx)
Mar 21 13:30:42.294121 kernel: QLogic iSCSI HBA Driver
Mar 21 13:30:42.353916 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 21 13:30:42.356524 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 21 13:30:42.415938 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 21 13:30:42.416029 kernel: device-mapper: uevent: version 1.0.3
Mar 21 13:30:42.419230 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 21 13:30:42.479797 kernel: raid6: sse2x4 gen() 5198 MB/s
Mar 21 13:30:42.498808 kernel: raid6: sse2x2 gen() 5982 MB/s
Mar 21 13:30:42.515997 kernel: raid6: sse2x1 gen() 9041 MB/s
Mar 21 13:30:42.516071 kernel: raid6: using algorithm sse2x1 gen() 9041 MB/s
Mar 21 13:30:42.536062 kernel: raid6: .... xor() 7293 MB/s, rmw enabled
Mar 21 13:30:42.536125 kernel: raid6: using ssse3x2 recovery algorithm
Mar 21 13:30:42.558754 kernel: xor: measuring software checksum speed
Mar 21 13:30:42.558824 kernel: prefetch64-sse : 16982 MB/sec
Mar 21 13:30:42.561083 kernel: generic_sse : 16861 MB/sec
Mar 21 13:30:42.561134 kernel: xor: using function: prefetch64-sse (16982 MB/sec)
Mar 21 13:30:42.743767 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 21 13:30:42.761862 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 21 13:30:42.767796 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 21 13:30:42.794414 systemd-udevd[405]: Using default interface naming scheme 'v255'.
Mar 21 13:30:42.799797 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 21 13:30:42.806339 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 21 13:30:42.837838 dracut-pre-trigger[417]: rd.md=0: removing MD RAID activation
Mar 21 13:30:42.882525 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 21 13:30:42.888861 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 21 13:30:42.974294 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 21 13:30:42.984676 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 21 13:30:43.038568 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 21 13:30:43.044324 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues
Mar 21 13:30:43.066259 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB)
Mar 21 13:30:43.066538 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 21 13:30:43.066554 kernel: GPT:17805311 != 20971519
Mar 21 13:30:43.066567 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 21 13:30:43.066579 kernel: GPT:17805311 != 20971519
Mar 21 13:30:43.066596 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 21 13:30:43.066608 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 21 13:30:43.043450 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 21 13:30:43.045093 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 21 13:30:43.049963 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 21 13:30:43.062975 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 21 13:30:43.083144 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 21 13:30:43.098754 kernel: libata version 3.00 loaded.
Mar 21 13:30:43.101923 kernel: ata_piix 0000:00:01.1: version 2.13
Mar 21 13:30:43.118795 kernel: scsi host0: ata_piix
Mar 21 13:30:43.118953 kernel: scsi host1: ata_piix
Mar 21 13:30:43.119072 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14
Mar 21 13:30:43.119086 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15
Mar 21 13:30:43.121782 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 21 13:30:43.121922 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 21 13:30:43.124807 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 21 13:30:43.125792 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 21 13:30:43.125925 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 21 13:30:43.127544 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 21 13:30:43.130861 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 21 13:30:43.134722 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 21 13:30:43.155735 kernel: BTRFS: device fsid c99b4410-5d95-4377-8189-88a588aa2514 devid 1 transid 38 /dev/vda3 scanned by (udev-worker) (466)
Mar 21 13:30:43.161723 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (457)
Mar 21 13:30:43.196590 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 21 13:30:43.224270 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 21 13:30:43.243411 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 21 13:30:43.252377 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 21 13:30:43.252986 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 21 13:30:43.266239 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 21 13:30:43.268649 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 21 13:30:43.271876 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 21 13:30:43.293401 disk-uuid[507]: Primary Header is updated.
Mar 21 13:30:43.293401 disk-uuid[507]: Secondary Entries is updated.
Mar 21 13:30:43.293401 disk-uuid[507]: Secondary Header is updated.
Mar 21 13:30:43.303729 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 21 13:30:43.322991 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 21 13:30:44.326807 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 21 13:30:44.329807 disk-uuid[508]: The operation has completed successfully.
Mar 21 13:30:44.403893 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 21 13:30:44.404146 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 21 13:30:44.460531 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 21 13:30:44.479001 sh[527]: Success
Mar 21 13:30:44.496111 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3"
Mar 21 13:30:44.572975 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 21 13:30:44.575790 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 21 13:30:44.589111 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 21 13:30:44.601750 kernel: BTRFS info (device dm-0): first mount of filesystem c99b4410-5d95-4377-8189-88a588aa2514
Mar 21 13:30:44.601821 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 21 13:30:44.601853 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 21 13:30:44.603905 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 21 13:30:44.605545 kernel: BTRFS info (device dm-0): using free space tree
Mar 21 13:30:44.620387 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 21 13:30:44.621444 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 21 13:30:44.624822 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 21 13:30:44.627163 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 21 13:30:44.653838 kernel: BTRFS info (device vda6): first mount of filesystem 667b391b-b0e4-4f87-a670-43615a660c46
Mar 21 13:30:44.653956 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 21 13:30:44.653988 kernel: BTRFS info (device vda6): using free space tree
Mar 21 13:30:44.660772 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 21 13:30:44.666855 kernel: BTRFS info (device vda6): last unmount of filesystem 667b391b-b0e4-4f87-a670-43615a660c46
Mar 21 13:30:44.676193 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 21 13:30:44.677814 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 21 13:30:44.839376 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 21 13:30:44.843811 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 21 13:30:44.853900 ignition[605]: Ignition 2.20.0
Mar 21 13:30:44.853915 ignition[605]: Stage: fetch-offline
Mar 21 13:30:44.856295 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 21 13:30:44.853959 ignition[605]: no configs at "/usr/lib/ignition/base.d"
Mar 21 13:30:44.853970 ignition[605]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 21 13:30:44.854076 ignition[605]: parsed url from cmdline: ""
Mar 21 13:30:44.854080 ignition[605]: no config URL provided
Mar 21 13:30:44.854086 ignition[605]: reading system config file "/usr/lib/ignition/user.ign"
Mar 21 13:30:44.854094 ignition[605]: no config at "/usr/lib/ignition/user.ign"
Mar 21 13:30:44.854099 ignition[605]: failed to fetch config: resource requires networking
Mar 21 13:30:44.854366 ignition[605]: Ignition finished successfully
Mar 21 13:30:44.882525 systemd-networkd[711]: lo: Link UP
Mar 21 13:30:44.882537 systemd-networkd[711]: lo: Gained carrier
Mar 21 13:30:44.883803 systemd-networkd[711]: Enumeration completed
Mar 21 13:30:44.883883 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 21 13:30:44.884137 systemd-networkd[711]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 21 13:30:44.884142 systemd-networkd[711]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 21 13:30:44.885091 systemd-networkd[711]: eth0: Link UP
Mar 21 13:30:44.885096 systemd-networkd[711]: eth0: Gained carrier
Mar 21 13:30:44.885104 systemd-networkd[711]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 21 13:30:44.885232 systemd[1]: Reached target network.target - Network.
Mar 21 13:30:44.888936 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 21 13:30:44.900156 systemd-networkd[711]: eth0: DHCPv4 address 172.24.4.107/24, gateway 172.24.4.1 acquired from 172.24.4.1
Mar 21 13:30:44.911759 ignition[716]: Ignition 2.20.0
Mar 21 13:30:44.911771 ignition[716]: Stage: fetch
Mar 21 13:30:44.911970 ignition[716]: no configs at "/usr/lib/ignition/base.d"
Mar 21 13:30:44.911983 ignition[716]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 21 13:30:44.912094 ignition[716]: parsed url from cmdline: ""
Mar 21 13:30:44.912098 ignition[716]: no config URL provided
Mar 21 13:30:44.912104 ignition[716]: reading system config file "/usr/lib/ignition/user.ign"
Mar 21 13:30:44.912114 ignition[716]: no config at "/usr/lib/ignition/user.ign"
Mar 21 13:30:44.912217 ignition[716]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Mar 21 13:30:44.912238 ignition[716]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Mar 21 13:30:44.912253 ignition[716]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Mar 21 13:30:45.442167 ignition[716]: GET result: OK
Mar 21 13:30:45.442343 ignition[716]: parsing config with SHA512: 9b1ade03a01158d074e7e7627b1efab3719f8b039c82bbeaeb06f0c316b6c25576f7f1a14bc7e7a211c5ac6876b28b247f10c1f0c8ad938f8c8352b379a91bec
Mar 21 13:30:45.455674 unknown[716]: fetched base config from "system"
Mar 21 13:30:45.455734 unknown[716]: fetched base config from "system"
Mar 21 13:30:45.456608 ignition[716]: fetch: fetch complete
Mar 21 13:30:45.455750 unknown[716]: fetched user config from "openstack"
Mar 21 13:30:45.456621 ignition[716]: fetch: fetch passed
Mar 21 13:30:45.460213 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 21 13:30:45.456759 ignition[716]: Ignition finished successfully
Mar 21 13:30:45.467031 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 21 13:30:45.516055 ignition[723]: Ignition 2.20.0
Mar 21 13:30:45.516084 ignition[723]: Stage: kargs
Mar 21 13:30:45.516427 ignition[723]: no configs at "/usr/lib/ignition/base.d"
Mar 21 13:30:45.516452 ignition[723]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 21 13:30:45.520761 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 21 13:30:45.518623 ignition[723]: kargs: kargs passed
Mar 21 13:30:45.518751 ignition[723]: Ignition finished successfully
Mar 21 13:30:45.525981 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 21 13:30:45.563738 ignition[730]: Ignition 2.20.0
Mar 21 13:30:45.565564 ignition[730]: Stage: disks
Mar 21 13:30:45.567108 ignition[730]: no configs at "/usr/lib/ignition/base.d"
Mar 21 13:30:45.567136 ignition[730]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 21 13:30:45.572689 ignition[730]: disks: disks passed
Mar 21 13:30:45.574087 ignition[730]: Ignition finished successfully
Mar 21 13:30:45.577170 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 21 13:30:45.579497 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 21 13:30:45.581502 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 21 13:30:45.584477 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 21 13:30:45.587399 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 21 13:30:45.589971 systemd[1]: Reached target basic.target - Basic System.
Mar 21 13:30:45.594755 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 21 13:30:45.642495 systemd-fsck[738]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Mar 21 13:30:45.651510 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 21 13:30:45.657100 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 21 13:30:45.812756 kernel: EXT4-fs (vda9): mounted filesystem c540419e-275b-4bd7-8ebd-24b19ec75c0b r/w with ordered data mode. Quota mode: none.
Mar 21 13:30:45.815295 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 21 13:30:45.817640 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 21 13:30:45.822086 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 21 13:30:45.832492 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 21 13:30:45.834084 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 21 13:30:45.836085 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Mar 21 13:30:45.839374 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 21 13:30:45.840215 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 21 13:30:45.849783 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (746)
Mar 21 13:30:45.849884 kernel: BTRFS info (device vda6): first mount of filesystem 667b391b-b0e4-4f87-a670-43615a660c46
Mar 21 13:30:45.853745 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 21 13:30:45.853808 kernel: BTRFS info (device vda6): using free space tree
Mar 21 13:30:45.858937 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 21 13:30:45.864870 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 21 13:30:45.872504 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 21 13:30:45.869633 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 21 13:30:46.037924 initrd-setup-root[774]: cut: /sysroot/etc/passwd: No such file or directory
Mar 21 13:30:46.047404 initrd-setup-root[782]: cut: /sysroot/etc/group: No such file or directory
Mar 21 13:30:46.055441 initrd-setup-root[789]: cut: /sysroot/etc/shadow: No such file or directory
Mar 21 13:30:46.064198 initrd-setup-root[796]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 21 13:30:46.108856 systemd-networkd[711]: eth0: Gained IPv6LL
Mar 21 13:30:46.181888 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 21 13:30:46.185931 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 21 13:30:46.190345 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 21 13:30:46.200635 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 21 13:30:46.204784 kernel: BTRFS info (device vda6): last unmount of filesystem 667b391b-b0e4-4f87-a670-43615a660c46
Mar 21 13:30:46.239346 ignition[864]: INFO : Ignition 2.20.0
Mar 21 13:30:46.239346 ignition[864]: INFO : Stage: mount
Mar 21 13:30:46.244020 ignition[864]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 21 13:30:46.244020 ignition[864]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 21 13:30:46.244020 ignition[864]: INFO : mount: mount passed
Mar 21 13:30:46.244020 ignition[864]: INFO : Ignition finished successfully
Mar 21 13:30:46.241429 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 21 13:30:46.254888 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 21 13:30:53.103139 coreos-metadata[748]: Mar 21 13:30:53.102 WARN failed to locate config-drive, using the metadata service API instead
Mar 21 13:30:53.144510 coreos-metadata[748]: Mar 21 13:30:53.144 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 21 13:30:53.161910 coreos-metadata[748]: Mar 21 13:30:53.161 INFO Fetch successful
Mar 21 13:30:53.163492 coreos-metadata[748]: Mar 21 13:30:53.163 INFO wrote hostname ci-9999-0-3-4-20a459a426.novalocal to /sysroot/etc/hostname
Mar 21 13:30:53.168592 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Mar 21 13:30:53.168930 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Mar 21 13:30:53.176683 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 21 13:30:53.206117 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 21 13:30:53.237796 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (881)
Mar 21 13:30:53.246950 kernel: BTRFS info (device vda6): first mount of filesystem 667b391b-b0e4-4f87-a670-43615a660c46
Mar 21 13:30:53.247013 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 21 13:30:53.251188 kernel: BTRFS info (device vda6): using free space tree
Mar 21 13:30:53.262791 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 21 13:30:53.267597 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 21 13:30:53.315342 ignition[899]: INFO : Ignition 2.20.0
Mar 21 13:30:53.315342 ignition[899]: INFO : Stage: files
Mar 21 13:30:53.318313 ignition[899]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 21 13:30:53.318313 ignition[899]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 21 13:30:53.318313 ignition[899]: DEBUG : files: compiled without relabeling support, skipping
Mar 21 13:30:53.323845 ignition[899]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 21 13:30:53.323845 ignition[899]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 21 13:30:53.328548 ignition[899]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 21 13:30:53.328548 ignition[899]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 21 13:30:53.328548 ignition[899]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 21 13:30:53.327128 unknown[899]: wrote ssh authorized keys file for user: core
Mar 21 13:30:53.336108 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 21 13:30:53.336108 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Mar 21 13:30:53.396006 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 21 13:30:54.070571 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 21 13:30:54.070571 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 21 13:30:54.075351 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1
Mar 21 13:30:54.755469 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 21 13:30:57.582604 ignition[899]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 21 13:30:57.582604 ignition[899]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 21 13:30:57.587294 ignition[899]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 21 13:30:57.587294 ignition[899]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 21 13:30:57.587294 ignition[899]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 21 13:30:57.587294 ignition[899]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 21 13:30:57.587294 ignition[899]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 21 13:30:57.587294 ignition[899]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 21 13:30:57.603767 ignition[899]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 21 13:30:57.603767 ignition[899]: INFO : files: files passed
Mar 21 13:30:57.603767 ignition[899]: INFO : Ignition finished successfully
Mar 21 13:30:57.590414 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 21 13:30:57.599829 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 21 13:30:57.602076 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 21 13:30:57.617996 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 21 13:30:57.618098 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 21 13:30:57.627887 initrd-setup-root-after-ignition[933]: grep: /sysroot/etc/flatcar/enabled-sysext.conf
Mar 21 13:30:57.629317 initrd-setup-root-after-ignition[929]: grep:
Mar 21 13:30:57.629317 initrd-setup-root-after-ignition[933]: : No such file or directory
Mar 21 13:30:57.630577 initrd-setup-root-after-ignition[929]: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 21 13:30:57.630577 initrd-setup-root-after-ignition[929]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 21 13:30:57.629871 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 21 13:30:57.632554 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 21 13:30:57.636856 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 21 13:30:57.693599 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 21 13:30:57.693910 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 21 13:30:57.696363 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 21 13:30:57.697822 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 21 13:30:57.699762 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 21 13:30:57.701838 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 21 13:30:57.725787 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 21 13:30:57.730923 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 21 13:30:57.760327 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 21 13:30:57.762076 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 21 13:30:57.764928 systemd[1]: Stopped target timers.target - Timer Units.
Mar 21 13:30:57.767625 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 21 13:30:57.767957 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 21 13:30:57.770766 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 21 13:30:57.772468 systemd[1]: Stopped target basic.target - Basic System.
Mar 21 13:30:57.775273 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 21 13:30:57.777604 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 21 13:30:57.779954 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 21 13:30:57.782645 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 21 13:30:57.785293 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 21 13:30:57.788068 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 21 13:30:57.790784 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 21 13:30:57.793531 systemd[1]: Stopped target swap.target - Swaps.
Mar 21 13:30:57.795936 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 21 13:30:57.796188 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 21 13:30:57.799053 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 21 13:30:57.800984 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 21 13:30:57.803355 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 21 13:30:57.804052 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 21 13:30:57.806218 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 21 13:30:57.806600 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 21 13:30:57.810176 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 21 13:30:57.810479 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 21 13:30:57.813173 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 21 13:30:57.813429 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 21 13:30:57.820112 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 21 13:30:57.822521 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 21 13:30:57.824504 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 21 13:30:57.838046 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 21 13:30:57.839290 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 21 13:30:57.839593 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 21 13:30:57.846519 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 21 13:30:57.847265 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 21 13:30:57.866139 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 21 13:30:57.866235 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 21 13:30:57.873723 ignition[953]: INFO : Ignition 2.20.0
Mar 21 13:30:57.873723 ignition[953]: INFO : Stage: umount
Mar 21 13:30:57.873723 ignition[953]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 21 13:30:57.873723 ignition[953]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 21 13:30:57.873723 ignition[953]: INFO : umount: umount passed
Mar 21 13:30:57.873723 ignition[953]: INFO : Ignition finished successfully
Mar 21 13:30:57.874720 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 21 13:30:57.874828 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 21 13:30:57.877029 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 21 13:30:57.877125 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 21 13:30:57.878490 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 21 13:30:57.878536 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 21 13:30:57.879982 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 21 13:30:57.880026 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 21 13:30:57.881638 systemd[1]: Stopped target network.target - Network.
Mar 21 13:30:57.882135 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 21 13:30:57.882186 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 21 13:30:57.883902 systemd[1]: Stopped target paths.target - Path Units.
Mar 21 13:30:57.884556 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 21 13:30:57.885297 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 21 13:30:57.885992 systemd[1]: Stopped target slices.target - Slice Units.
Mar 21 13:30:57.888167 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 21 13:30:57.890502 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 21 13:30:57.890542 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 21 13:30:57.891522 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 21 13:30:57.891553 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 21 13:30:57.892562 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 21 13:30:57.892608 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 21 13:30:57.893776 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 21 13:30:57.893819 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 21 13:30:57.895088 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 21 13:30:57.896253 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 21 13:30:57.898440 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 21 13:30:57.900269 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 21 13:30:57.900366 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 21 13:30:57.904938 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 21 13:30:57.906036 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 21 13:30:57.906281 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 21 13:30:57.910512 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 21 13:30:57.910976 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 21 13:30:57.911122 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 21 13:30:57.914210 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 21 13:30:57.914281 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 21 13:30:57.914977 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 21 13:30:57.915024 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 21 13:30:57.922539 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 21 13:30:57.923632 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 21 13:30:57.923732 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 21 13:30:57.925852 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 21 13:30:57.925904 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 21 13:30:57.927095 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 21 13:30:57.927140 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 21 13:30:57.928393 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 21 13:30:57.928439 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 21 13:30:57.931170 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 21 13:30:57.933709 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 21 13:30:57.933791 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 21 13:30:57.943211 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 21 13:30:57.943369 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 21 13:30:57.944649 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 21 13:30:57.944732 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 21 13:30:57.945393 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 21 13:30:57.945425 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 21 13:30:57.946651 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 21 13:30:57.946729 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 21 13:30:57.948432 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 21 13:30:57.948478 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 21 13:30:57.949918 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 21 13:30:57.950021 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 21 13:30:57.955950 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 21 13:30:57.958112 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 21 13:30:57.958247 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 21 13:30:57.959915 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 21 13:30:57.960016 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 21 13:30:57.964164 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 21 13:30:57.964300 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 21 13:30:57.965070 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 21 13:30:57.965260 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 21 13:30:57.971436 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 21 13:30:57.971544 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 21 13:30:57.973055 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 21 13:30:57.974941 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 21 13:30:57.994241 systemd[1]: Switching root.
Mar 21 13:30:58.033188 systemd-journald[183]: Journal stopped
Mar 21 13:31:00.016378 systemd-journald[183]: Received SIGTERM from PID 1 (systemd).
Mar 21 13:31:00.016439 kernel: SELinux: policy capability network_peer_controls=1
Mar 21 13:31:00.016457 kernel: SELinux: policy capability open_perms=1
Mar 21 13:31:00.016469 kernel: SELinux: policy capability extended_socket_class=1
Mar 21 13:31:00.016480 kernel: SELinux: policy capability always_check_network=0
Mar 21 13:31:00.016494 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 21 13:31:00.016506 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 21 13:31:00.016516 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 21 13:31:00.016528 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 21 13:31:00.016541 kernel: audit: type=1403 audit(1742563858.447:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 21 13:31:00.016554 systemd[1]: Successfully loaded SELinux policy in 79.292ms.
Mar 21 13:31:00.016578 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 26.682ms.
Mar 21 13:31:00.016592 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 21 13:31:00.016604 systemd[1]: Detected virtualization kvm.
Mar 21 13:31:00.016616 systemd[1]: Detected architecture x86-64.
Mar 21 13:31:00.016628 systemd[1]: Detected first boot.
Mar 21 13:31:00.016640 systemd[1]: Hostname set to .
Mar 21 13:31:00.016655 systemd[1]: Initializing machine ID from VM UUID.
Mar 21 13:31:00.016667 zram_generator::config[998]: No configuration found.
Mar 21 13:31:00.016680 kernel: Guest personality initialized and is inactive
Mar 21 13:31:00.025253 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Mar 21 13:31:00.025290 kernel: Initialized host personality
Mar 21 13:31:00.025303 kernel: NET: Registered PF_VSOCK protocol family
Mar 21 13:31:00.025315 systemd[1]: Populated /etc with preset unit settings.
Mar 21 13:31:00.025330 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 21 13:31:00.025348 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 21 13:31:00.025360 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 21 13:31:00.025373 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 21 13:31:00.025385 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 21 13:31:00.025397 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 21 13:31:00.025409 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 21 13:31:00.025421 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 21 13:31:00.025434 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 21 13:31:00.025446 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 21 13:31:00.025464 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 21 13:31:00.025476 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 21 13:31:00.025489 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 21 13:31:00.025501 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 21 13:31:00.025513 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 21 13:31:00.025525 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 21 13:31:00.025538 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 21 13:31:00.025552 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 21 13:31:00.025564 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 21 13:31:00.025576 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 21 13:31:00.025588 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 21 13:31:00.025600 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 21 13:31:00.025612 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 21 13:31:00.025624 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 21 13:31:00.025636 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 21 13:31:00.025651 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 21 13:31:00.025664 systemd[1]: Reached target slices.target - Slice Units.
Mar 21 13:31:00.025676 systemd[1]: Reached target swap.target - Swaps.
Mar 21 13:31:00.025688 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 21 13:31:00.027850 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 21 13:31:00.027868 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 21 13:31:00.027881 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 21 13:31:00.027893 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 21 13:31:00.027905 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 21 13:31:00.027921 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 21 13:31:00.027934 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 21 13:31:00.027957 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 21 13:31:00.028002 systemd[1]: Mounting media.mount - External Media Directory...
Mar 21 13:31:00.028048 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 21 13:31:00.028093 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 21 13:31:00.028133 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 21 13:31:00.028174 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 21 13:31:00.028220 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 21 13:31:00.028271 systemd[1]: Reached target machines.target - Containers.
Mar 21 13:31:00.028308 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 21 13:31:00.028321 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 21 13:31:00.028334 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 21 13:31:00.028346 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 21 13:31:00.028360 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 21 13:31:00.028372 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 21 13:31:00.028385 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 21 13:31:00.028400 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 21 13:31:00.028412 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 21 13:31:00.028424 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 21 13:31:00.028436 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 21 13:31:00.028448 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 21 13:31:00.028460 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 21 13:31:00.028472 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 21 13:31:00.028485 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 21 13:31:00.028499 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 21 13:31:00.028511 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 21 13:31:00.028523 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 21 13:31:00.028536 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 21 13:31:00.028548 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 21 13:31:00.028560 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 21 13:31:00.028574 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 21 13:31:00.028586 systemd[1]: Stopped verity-setup.service.
Mar 21 13:31:00.028599 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 21 13:31:00.028611 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 21 13:31:00.028624 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 21 13:31:00.028638 systemd[1]: Mounted media.mount - External Media Directory.
Mar 21 13:31:00.028650 kernel: loop: module loaded
Mar 21 13:31:00.028662 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 21 13:31:00.028674 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 21 13:31:00.028690 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 21 13:31:00.030795 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 21 13:31:00.030824 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 21 13:31:00.030839 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 21 13:31:00.030861 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 21 13:31:00.030878 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 21 13:31:00.030890 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 21 13:31:00.030903 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 21 13:31:00.030957 systemd-journald[1085]: Collecting audit messages is disabled.
Mar 21 13:31:00.030989 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 21 13:31:00.031002 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 21 13:31:00.031015 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 21 13:31:00.031030 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 21 13:31:00.031043 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 21 13:31:00.031056 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 21 13:31:00.031068 kernel: ACPI: bus type drm_connector registered
Mar 21 13:31:00.031080 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 21 13:31:00.031094 systemd-journald[1085]: Journal started
Mar 21 13:31:00.031119 systemd-journald[1085]: Runtime Journal (/run/log/journal/70ae9356508a4912891abb3abbdadb00) is 8M, max 78.2M, 70.2M free.
Mar 21 13:30:59.670182 systemd[1]: Queued start job for default target multi-user.target.
Mar 21 13:30:59.678886 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 21 13:30:59.679300 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 21 13:31:00.040827 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 21 13:31:00.040884 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 21 13:31:00.046724 kernel: fuse: init (API version 7.39)
Mar 21 13:31:00.049191 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 21 13:31:00.049776 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 21 13:31:00.054734 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 21 13:31:00.061036 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 21 13:31:00.068493 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 21 13:31:00.069243 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 21 13:31:00.069280 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 21 13:31:00.071983 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 21 13:31:00.076329 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 21 13:31:00.081359 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 21 13:31:00.082008 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 21 13:31:00.082994 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 21 13:31:00.086913 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 21 13:31:00.088848 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 21 13:31:00.091000 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 21 13:31:00.092553 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 21 13:31:00.096920 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 21 13:31:00.106014 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 21 13:31:00.110757 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 21 13:31:00.111528 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 21 13:31:00.113165 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 21 13:31:00.114132 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 21 13:31:00.116082 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 21 13:31:00.120010 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 21 13:31:00.129446 systemd-journald[1085]: Time spent on flushing to /var/log/journal/70ae9356508a4912891abb3abbdadb00 is 28.901ms for 961 entries.
Mar 21 13:31:00.129446 systemd-journald[1085]: System Journal (/var/log/journal/70ae9356508a4912891abb3abbdadb00) is 8M, max 584.8M, 576.8M free.
Mar 21 13:31:00.172881 systemd-journald[1085]: Received client request to flush runtime journal.
Mar 21 13:31:00.172923 kernel: loop0: detected capacity change from 0 to 8
Mar 21 13:31:00.133725 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 21 13:31:00.136832 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 21 13:31:00.140884 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 21 13:31:00.145367 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 21 13:31:00.174830 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 21 13:31:00.176573 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 21 13:31:00.206900 udevadm[1147]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 21 13:31:00.216097 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 21 13:31:00.231750 kernel: loop1: detected capacity change from 0 to 151640
Mar 21 13:31:00.237855 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 21 13:31:00.271492 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 21 13:31:00.274855 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 21 13:31:00.314761 systemd-tmpfiles[1159]: ACLs are not supported, ignoring.
Mar 21 13:31:00.314780 systemd-tmpfiles[1159]: ACLs are not supported, ignoring.
Mar 21 13:31:00.319984 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 21 13:31:00.322724 kernel: loop2: detected capacity change from 0 to 109808
Mar 21 13:31:00.403362 kernel: loop3: detected capacity change from 0 to 210664
Mar 21 13:31:00.470717 kernel: loop4: detected capacity change from 0 to 8
Mar 21 13:31:00.474740 kernel: loop5: detected capacity change from 0 to 151640
Mar 21 13:31:00.568741 kernel: loop6: detected capacity change from 0 to 109808
Mar 21 13:31:00.617060 kernel: loop7: detected capacity change from 0 to 210664
Mar 21 13:31:00.686256 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 21 13:31:00.699906 (sd-merge)[1165]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Mar 21 13:31:00.700439 (sd-merge)[1165]: Merged extensions into '/usr'.
Mar 21 13:31:00.716068 systemd[1]: Reload requested from client PID 1136 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 21 13:31:00.716237 systemd[1]: Reloading...
Mar 21 13:31:00.796918 zram_generator::config[1189]: No configuration found.
Mar 21 13:31:01.012011 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 21 13:31:01.093080 systemd[1]: Reloading finished in 376 ms.
Mar 21 13:31:01.107931 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 21 13:31:01.108822 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 21 13:31:01.116882 systemd[1]: Starting ensure-sysext.service...
Mar 21 13:31:01.120826 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 21 13:31:01.126834 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 21 13:31:01.163933 systemd-udevd[1251]: Using default interface naming scheme 'v255'.
Mar 21 13:31:01.171061 systemd-tmpfiles[1250]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 21 13:31:01.171310 systemd-tmpfiles[1250]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 21 13:31:01.177783 systemd-tmpfiles[1250]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 21 13:31:01.178561 systemd-tmpfiles[1250]: ACLs are not supported, ignoring.
Mar 21 13:31:01.178776 systemd-tmpfiles[1250]: ACLs are not supported, ignoring.
Mar 21 13:31:01.183022 systemd[1]: Reload requested from client PID 1249 ('systemctl') (unit ensure-sysext.service)...
Mar 21 13:31:01.183062 systemd[1]: Reloading...
Mar 21 13:31:01.204239 systemd-tmpfiles[1250]: Detected autofs mount point /boot during canonicalization of boot.
Mar 21 13:31:01.204263 systemd-tmpfiles[1250]: Skipping /boot
Mar 21 13:31:01.233557 systemd-tmpfiles[1250]: Detected autofs mount point /boot during canonicalization of boot.
Mar 21 13:31:01.233569 systemd-tmpfiles[1250]: Skipping /boot
Mar 21 13:31:01.264795 ldconfig[1131]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 21 13:31:01.269739 zram_generator::config[1289]: No configuration found.
Mar 21 13:31:01.386799 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1259)
Mar 21 13:31:01.446832 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Mar 21 13:31:01.474717 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Mar 21 13:31:01.481756 kernel: ACPI: button: Power Button [PWRF]
Mar 21 13:31:01.503759 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Mar 21 13:31:01.549038 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 21 13:31:01.559733 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Mar 21 13:31:01.565256 kernel: mousedev: PS/2 mouse device common for all mice
Mar 21 13:31:01.565324 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Mar 21 13:31:01.581322 kernel: Console: switching to colour dummy device 80x25
Mar 21 13:31:01.581395 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Mar 21 13:31:01.581417 kernel: [drm] features: -context_init
Mar 21 13:31:01.583716 kernel: [drm] number of scanouts: 1
Mar 21 13:31:01.583762 kernel: [drm] number of cap sets: 0
Mar 21 13:31:01.586712 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0
Mar 21 13:31:01.596068 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Mar 21 13:31:01.596161 kernel: Console: switching to colour frame buffer device 160x50
Mar 21 13:31:01.605729 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Mar 21 13:31:01.664124 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 21 13:31:01.664469 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 21 13:31:01.665065 systemd[1]: Reloading finished in 478 ms.
Mar 21 13:31:01.676844 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 21 13:31:01.677477 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 21 13:31:01.686273 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 21 13:31:01.734551 systemd[1]: Finished ensure-sysext.service.
Mar 21 13:31:01.736539 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 21 13:31:01.738061 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 21 13:31:01.744812 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 21 13:31:01.745046 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 21 13:31:01.747880 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 21 13:31:01.758282 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 21 13:31:01.762666 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 21 13:31:01.767000 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 21 13:31:01.767337 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 21 13:31:01.769905 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 21 13:31:01.771523 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 21 13:31:01.776502 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 21 13:31:01.779912 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 21 13:31:01.787241 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 21 13:31:01.793839 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 21 13:31:01.799661 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 21 13:31:01.803878 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 21 13:31:01.803979 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 21 13:31:01.805059 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 21 13:31:01.805929 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 21 13:31:01.806312 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 21 13:31:01.806621 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 21 13:31:01.807079 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 21 13:31:01.807344 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 21 13:31:01.807500 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 21 13:31:01.807775 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 21 13:31:01.807913 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 21 13:31:01.817930 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 21 13:31:01.819670 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 21 13:31:01.821939 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 21 13:31:01.824752 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 21 13:31:01.831179 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 21 13:31:01.867397 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 21 13:31:01.876047 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 21 13:31:01.881580 lvm[1395]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 21 13:31:01.883265 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 21 13:31:01.908180 augenrules[1417]: No rules
Mar 21 13:31:01.911218 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 21 13:31:01.911976 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 21 13:31:01.920122 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 21 13:31:01.924002 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 21 13:31:01.928679 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 21 13:31:01.934445 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 21 13:31:01.937370 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 21 13:31:01.949030 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 21 13:31:01.953293 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 21 13:31:01.956895 lvm[1425]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 21 13:31:01.997764 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 21 13:31:02.013128 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 21 13:31:02.045421 systemd-resolved[1381]: Positive Trust Anchors:
Mar 21 13:31:02.045440 systemd-resolved[1381]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 21 13:31:02.045488 systemd-resolved[1381]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 21 13:31:02.053143 systemd-resolved[1381]: Using system hostname 'ci-9999-0-3-4-20a459a426.novalocal'.
Mar 21 13:31:02.054509 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 21 13:31:02.055242 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 21 13:31:02.077359 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 21 13:31:02.077675 systemd-networkd[1380]: lo: Link UP
Mar 21 13:31:02.077679 systemd-networkd[1380]: lo: Gained carrier
Mar 21 13:31:02.079846 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 21 13:31:02.080290 systemd-networkd[1380]: Enumeration completed
Mar 21 13:31:02.080373 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 21 13:31:02.080872 systemd-networkd[1380]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 21 13:31:02.081109 systemd-networkd[1380]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 21 13:31:02.081795 systemd-networkd[1380]: eth0: Link UP
Mar 21 13:31:02.082216 systemd-networkd[1380]: eth0: Gained carrier
Mar 21 13:31:02.082288 systemd-networkd[1380]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 21 13:31:02.082868 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 21 13:31:02.083114 systemd-timesyncd[1386]: No network connectivity, watching for changes.
Mar 21 13:31:02.084820 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 21 13:31:02.087098 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 21 13:31:02.087130 systemd[1]: Reached target paths.target - Path Units.
Mar 21 13:31:02.088737 systemd[1]: Reached target time-set.target - System Time Set.
Mar 21 13:31:02.090251 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 21 13:31:02.091407 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 21 13:31:02.092807 systemd[1]: Reached target timers.target - Timer Units.
Mar 21 13:31:02.095139 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 21 13:31:02.096564 systemd-networkd[1380]: eth0: DHCPv4 address 172.24.4.107/24, gateway 172.24.4.1 acquired from 172.24.4.1
Mar 21 13:31:02.098967 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 21 13:31:02.103275 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 21 13:31:02.104333 systemd-timesyncd[1386]: Network configuration changed, trying to establish connection.
Mar 21 13:31:02.105961 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
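The DHCPv4 lease logged above (172.24.4.107/24, gateway 172.24.4.1) fixes the node's subnet; a minimal sketch with Python's `ipaddress` module, using only the values from the log, confirms the gateway is on-link in the same /24:

```python
import ipaddress

# Values taken from the systemd-networkd log line above.
iface = ipaddress.ip_interface("172.24.4.107/24")
gateway = ipaddress.ip_address("172.24.4.1")

print(iface.network)                # 172.24.4.0/24
print(gateway in iface.network)     # True: gateway is directly reachable
print(iface.network.num_addresses)  # 256 addresses in the /24
```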
Mar 21 13:31:02.107128 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 21 13:31:02.111136 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 21 13:31:02.117144 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 21 13:31:02.118388 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 21 13:31:02.121681 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 21 13:31:02.123000 systemd[1]: Reached target network.target - Network.
Mar 21 13:31:02.124233 systemd[1]: Reached target sockets.target - Socket Units.
Mar 21 13:31:02.125387 systemd[1]: Reached target basic.target - Basic System.
Mar 21 13:31:02.126821 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 21 13:31:02.126930 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 21 13:31:02.128018 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 21 13:31:02.136912 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 21 13:31:02.142260 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 21 13:31:02.148810 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 21 13:31:02.153441 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 21 13:31:02.154127 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 21 13:31:02.159474 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 21 13:31:02.166448 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 21 13:31:02.174919 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 21 13:31:02.180386 jq[1445]: false
Mar 21 13:31:02.187998 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 21 13:31:02.196592 extend-filesystems[1448]: Found loop4
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found loop5
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found loop6
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found loop7
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found vda
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found vda1
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found vda2
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found vda3
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found usr
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found vda4
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found vda6
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found vda7
Mar 21 13:31:02.204498 extend-filesystems[1448]: Found vda9
Mar 21 13:31:02.204498 extend-filesystems[1448]: Checking size of /dev/vda9
Mar 21 13:31:02.224502 dbus-daemon[1444]: [system] SELinux support is enabled
Mar 21 13:31:02.206999 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 21 13:31:02.220889 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 21 13:31:02.230964 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 21 13:31:02.240903 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 21 13:31:02.241623 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 21 13:31:03.019390 extend-filesystems[1448]: Resized partition /dev/vda9
Mar 21 13:31:03.042580 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1307)
Mar 21 13:31:03.042607 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks
Mar 21 13:31:03.011530 systemd-resolved[1381]: Clock change detected. Flushing caches.
Mar 21 13:31:03.042691 extend-filesystems[1472]: resize2fs 1.47.2 (1-Jan-2025)
Mar 21 13:31:03.011647 systemd-timesyncd[1386]: Contacted time server 144.202.62.209:123 (1.flatcar.pool.ntp.org).
Mar 21 13:31:03.011708 systemd-timesyncd[1386]: Initial clock synchronization to Fri 2025-03-21 13:31:03.011480 UTC.
Mar 21 13:31:03.012037 systemd[1]: Starting update-engine.service - Update Engine...
Mar 21 13:31:03.019665 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 21 13:31:03.032564 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 21 13:31:03.063483 kernel: EXT4-fs (vda9): resized filesystem to 2014203
Mar 21 13:31:03.059724 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 21 13:31:03.059950 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 21 13:31:03.131426 jq[1471]: true
Mar 21 13:31:03.060224 systemd[1]: motdgen.service: Deactivated successfully.
Mar 21 13:31:03.132700 update_engine[1470]: I20250321 13:31:03.092262 1470 main.cc:92] Flatcar Update Engine starting
Mar 21 13:31:03.132700 update_engine[1470]: I20250321 13:31:03.109602 1470 update_check_scheduler.cc:74] Next update check in 8m50s
Mar 21 13:31:03.060382 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 21 13:31:03.065785 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 21 13:31:03.065964 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 21 13:31:03.090030 (ntainerd)[1480]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 21 13:31:03.134326 jq[1482]: true
Mar 21 13:31:03.101310 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 21 13:31:03.136864 extend-filesystems[1472]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
Mar 21 13:31:03.136864 extend-filesystems[1472]: old_desc_blocks = 1, new_desc_blocks = 1
Mar 21 13:31:03.136864 extend-filesystems[1472]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long.
Mar 21 13:31:03.101337 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 21 13:31:03.139234 extend-filesystems[1448]: Resized filesystem in /dev/vda9
Mar 21 13:31:03.104473 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 21 13:31:03.104500 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 21 13:31:03.111590 systemd[1]: Started update-engine.service - Update Engine.
Mar 21 13:31:03.115601 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 21 13:31:03.141664 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 21 13:31:03.142077 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 21 13:31:03.169917 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 21 13:31:03.177558 tar[1474]: linux-amd64/helm
Mar 21 13:31:03.181573 systemd-logind[1460]: New seat seat0.
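The online resize logged above grew /dev/vda9 from 1,617,920 to 2,014,203 blocks at the 4 KiB block size reported by resize2fs. A quick back-of-the-envelope check, using only the numbers from the log:

```python
BLOCK_SIZE = 4096        # "(4k) blocks" per the resize2fs message
OLD_BLOCKS = 1_617_920   # from "resizing filesystem from 1617920 ..."
NEW_BLOCKS = 2_014_203   # from "... to 2014203 blocks"

old_bytes = OLD_BLOCKS * BLOCK_SIZE
new_bytes = NEW_BLOCKS * BLOCK_SIZE
grown = new_bytes - old_bytes

print(f"old size: {old_bytes / 2**30:.2f} GiB")  # ~6.17 GiB
print(f"new size: {new_bytes / 2**30:.2f} GiB")  # ~7.68 GiB
print(f"grown by: {grown / 2**20:.0f} MiB")      # ~1548 MiB
```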
Mar 21 13:31:03.184415 systemd-logind[1460]: Watching system buttons on /dev/input/event2 (Power Button)
Mar 21 13:31:03.184502 systemd-logind[1460]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 21 13:31:03.184676 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 21 13:31:03.285178 locksmithd[1483]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 21 13:31:03.297453 bash[1505]: Updated "/home/core/.ssh/authorized_keys"
Mar 21 13:31:03.298784 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 21 13:31:03.312912 systemd[1]: Starting sshkeys.service...
Mar 21 13:31:03.345089 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 21 13:31:03.351365 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 21 13:31:03.615359 containerd[1480]: time="2025-03-21T13:31:03Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 21 13:31:03.616152 containerd[1480]: time="2025-03-21T13:31:03.616117291Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.642385317Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.39µs"
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.642467140Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.642502747Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.642756673Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.642778875Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.642806467Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.642884944Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.642910762Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.643155171Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.643179046Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.643191258Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 21 13:31:03.642935 containerd[1480]: time="2025-03-21T13:31:03.643201007Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 21 13:31:03.645276 containerd[1480]: time="2025-03-21T13:31:03.643280105Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 21 13:31:03.647672 containerd[1480]: time="2025-03-21T13:31:03.645513394Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 21 13:31:03.647672 containerd[1480]: time="2025-03-21T13:31:03.645572545Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 21 13:31:03.647672 containerd[1480]: time="2025-03-21T13:31:03.645586731Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 21 13:31:03.647672 containerd[1480]: time="2025-03-21T13:31:03.645630944Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 21 13:31:03.647672 containerd[1480]: time="2025-03-21T13:31:03.645916440Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 21 13:31:03.648527 containerd[1480]: time="2025-03-21T13:31:03.648499965Z" level=info msg="metadata content store policy set" policy=shared
Mar 21 13:31:03.660258 containerd[1480]: time="2025-03-21T13:31:03.660231497Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 21 13:31:03.660311 containerd[1480]: time="2025-03-21T13:31:03.660282703Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 21 13:31:03.660311 containerd[1480]: time="2025-03-21T13:31:03.660299465Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 21 13:31:03.660365 containerd[1480]: time="2025-03-21T13:31:03.660324151Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 21 13:31:03.660394 containerd[1480]: time="2025-03-21T13:31:03.660374065Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 21 13:31:03.660394 containerd[1480]: time="2025-03-21T13:31:03.660386979Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 21 13:31:03.660449 containerd[1480]: time="2025-03-21T13:31:03.660401065Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 21 13:31:03.660449 containerd[1480]: time="2025-03-21T13:31:03.660415452Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 21 13:31:03.660449 containerd[1480]: time="2025-03-21T13:31:03.660429970Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 21 13:31:03.660519 containerd[1480]: time="2025-03-21T13:31:03.660457932Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 21 13:31:03.660519 containerd[1480]: time="2025-03-21T13:31:03.660470115Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 21 13:31:03.660519 containerd[1480]: time="2025-03-21T13:31:03.660483119Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 21 13:31:03.660602 containerd[1480]: time="2025-03-21T13:31:03.660577366Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 21 13:31:03.660631 containerd[1480]: time="2025-03-21T13:31:03.660608655Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 21 13:31:03.660660 containerd[1480]: time="2025-03-21T13:31:03.660628903Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 21 13:31:03.660660 containerd[1480]: time="2025-03-21T13:31:03.660642298Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 21 13:31:03.660700 containerd[1480]: time="2025-03-21T13:31:03.660658388Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 21 13:31:03.660700 containerd[1480]: time="2025-03-21T13:31:03.660670180Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 21 13:31:03.660700 containerd[1480]: time="2025-03-21T13:31:03.660681732Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 21 13:31:03.660700 containerd[1480]: time="2025-03-21T13:31:03.660692763Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 21 13:31:03.660784 containerd[1480]: time="2025-03-21T13:31:03.660704855Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 21 13:31:03.660784 containerd[1480]: time="2025-03-21T13:31:03.660717629Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 21 13:31:03.660784 containerd[1480]: time="2025-03-21T13:31:03.660730103Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 21 13:31:03.660846 containerd[1480]: time="2025-03-21T13:31:03.660787851Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 21 13:31:03.660846 containerd[1480]: time="2025-03-21T13:31:03.660803550Z" level=info msg="Start snapshots syncer"
Mar 21 13:31:03.660894 containerd[1480]: time="2025-03-21T13:31:03.660843555Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 21 13:31:03.661553 containerd[1480]: time="2025-03-21T13:31:03.661141304Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 21 13:31:03.661553 containerd[1480]: time="2025-03-21T13:31:03.661202408Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661266358Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661347901Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661370564Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661383077Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661394659Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661407052Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661418223Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661454872Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661483846Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661503313Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661514654Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661543137Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661557164Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 21 13:31:03.661707 containerd[1480]: time="2025-03-21T13:31:03.661567183Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 21 13:31:03.662016 containerd[1480]: time="2025-03-21T13:31:03.661578163Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 21 13:31:03.662016 containerd[1480]: time="2025-03-21T13:31:03.661587290Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 21 13:31:03.662016 containerd[1480]: time="2025-03-21T13:31:03.661597499Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 21 13:31:03.662016 containerd[1480]: time="2025-03-21T13:31:03.661609171Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 21 13:31:03.662016 containerd[1480]: time="2025-03-21T13:31:03.661624159Z" level=info msg="runtime interface created"
Mar 21 13:31:03.662016 containerd[1480]: time="2025-03-21T13:31:03.661629990Z" level=info msg="created NRI interface"
Mar 21 13:31:03.662016 containerd[1480]: time="2025-03-21T13:31:03.661638897Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 21 13:31:03.662016 containerd[1480]: time="2025-03-21T13:31:03.661649627Z" level=info msg="Connect containerd service"
Mar 21 13:31:03.662016 containerd[1480]: time="2025-03-21T13:31:03.661674664Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 21 13:31:03.665467 containerd[1480]: time="2025-03-21T13:31:03.664772143Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 21 13:31:03.804207 sshd_keygen[1469]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 21 13:31:03.831578 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 21 13:31:03.835794 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 21 13:31:03.857007 systemd[1]: issuegen.service: Deactivated successfully.
Mar 21 13:31:03.857174 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 21 13:31:03.866786 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 21 13:31:03.886007 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 21 13:31:03.891763 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 21 13:31:03.895431 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 21 13:31:03.895616 tar[1474]: linux-amd64/LICENSE
Mar 21 13:31:03.895757 tar[1474]: linux-amd64/README.md
Mar 21 13:31:03.897056 systemd[1]: Reached target getty.target - Login Prompts.
Mar 21 13:31:03.917311 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 21 13:31:03.927175 containerd[1480]: time="2025-03-21T13:31:03.927134634Z" level=info msg="Start subscribing containerd event"
Mar 21 13:31:03.927321 containerd[1480]: time="2025-03-21T13:31:03.927288322Z" level=info msg="Start recovering state"
Mar 21 13:31:03.927471 containerd[1480]: time="2025-03-21T13:31:03.927154572Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 21 13:31:03.927658 containerd[1480]: time="2025-03-21T13:31:03.927642186Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 21 13:31:03.927797 containerd[1480]: time="2025-03-21T13:31:03.927758334Z" level=info msg="Start event monitor"
Mar 21 13:31:03.928376 containerd[1480]: time="2025-03-21T13:31:03.928344383Z" level=info msg="Start cni network conf syncer for default"
Mar 21 13:31:03.928376 containerd[1480]: time="2025-03-21T13:31:03.928370943Z" level=info msg="Start streaming server"
Mar 21 13:31:03.928431 containerd[1480]: time="2025-03-21T13:31:03.928384428Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 21 13:31:03.928431 containerd[1480]: time="2025-03-21T13:31:03.928393986Z" level=info msg="runtime interface starting up..."
Mar 21 13:31:03.928431 containerd[1480]: time="2025-03-21T13:31:03.928413192Z" level=info msg="starting plugins..."
Mar 21 13:31:03.928431 containerd[1480]: time="2025-03-21T13:31:03.928449120Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 21 13:31:03.928619 containerd[1480]: time="2025-03-21T13:31:03.928595264Z" level=info msg="containerd successfully booted in 0.313594s"
Mar 21 13:31:03.928825 systemd[1]: Started containerd.service - containerd container runtime.
Mar 21 13:31:04.850730 systemd-networkd[1380]: eth0: Gained IPv6LL
Mar 21 13:31:04.855303 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 21 13:31:04.861433 systemd[1]: Reached target network-online.target - Network is Online.
Mar 21 13:31:04.870481 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 21 13:31:04.880229 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 21 13:31:04.934373 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 21 13:31:06.889736 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 21 13:31:06.905322 (kubelet)[1575]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 21 13:31:08.340722 kubelet[1575]: E0321 13:31:08.340566 1575 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 21 13:31:08.344913 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 21 13:31:08.345239 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 21 13:31:08.346160 systemd[1]: kubelet.service: Consumed 2.344s CPU time, 246.8M memory peak.
Mar 21 13:31:08.764872 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 21 13:31:08.770634 systemd[1]: Started sshd@0-172.24.4.107:22-172.24.4.1:58492.service - OpenSSH per-connection server daemon (172.24.4.1:58492).
Mar 21 13:31:09.004322 login[1545]: pam_lastlog(login:session): file /var/log/lastlog is locked/read, retrying
Mar 21 13:31:09.006182 login[1544]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 21 13:31:09.038900 systemd-logind[1460]: New session 1 of user core.
Mar 21 13:31:09.042317 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 21 13:31:09.047262 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 21 13:31:09.088292 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 21 13:31:09.094232 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 21 13:31:09.121853 (systemd)[1592]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 21 13:31:09.128000 systemd-logind[1460]: New session c1 of user core.
Mar 21 13:31:09.325996 systemd[1592]: Queued start job for default target default.target.
Mar 21 13:31:09.333323 systemd[1592]: Created slice app.slice - User Application Slice.
Mar 21 13:31:09.333352 systemd[1592]: Reached target paths.target - Paths.
Mar 21 13:31:09.333390 systemd[1592]: Reached target timers.target - Timers.
Mar 21 13:31:09.336524 systemd[1592]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 21 13:31:09.344099 systemd[1592]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 21 13:31:09.344999 systemd[1592]: Reached target sockets.target - Sockets.
Mar 21 13:31:09.345045 systemd[1592]: Reached target basic.target - Basic System.
Mar 21 13:31:09.345084 systemd[1592]: Reached target default.target - Main User Target.
Mar 21 13:31:09.345109 systemd[1592]: Startup finished in 203ms.
Mar 21 13:31:09.345781 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 21 13:31:09.354887 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 21 13:31:09.980615 coreos-metadata[1443]: Mar 21 13:31:09.980 WARN failed to locate config-drive, using the metadata service API instead
Mar 21 13:31:10.011365 login[1545]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 21 13:31:10.022413 systemd-logind[1460]: New session 2 of user core.
Mar 21 13:31:10.033976 coreos-metadata[1443]: Mar 21 13:31:10.033 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1
Mar 21 13:31:10.035919 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 21 13:31:10.310999 sshd[1584]: Accepted publickey for core from 172.24.4.1 port 58492 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:31:10.314286 sshd-session[1584]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:31:10.322945 coreos-metadata[1443]: Mar 21 13:31:10.322 INFO Fetch successful
Mar 21 13:31:10.322945 coreos-metadata[1443]: Mar 21 13:31:10.322 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 21 13:31:10.325277 systemd-logind[1460]: New session 3 of user core.
Mar 21 13:31:10.334950 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 21 13:31:10.336043 coreos-metadata[1443]: Mar 21 13:31:10.335 INFO Fetch successful
Mar 21 13:31:10.336043 coreos-metadata[1443]: Mar 21 13:31:10.335 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1
Mar 21 13:31:10.350128 coreos-metadata[1443]: Mar 21 13:31:10.349 INFO Fetch successful
Mar 21 13:31:10.350128 coreos-metadata[1443]: Mar 21 13:31:10.350 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1
Mar 21 13:31:10.364945 coreos-metadata[1443]: Mar 21 13:31:10.364 INFO Fetch successful
Mar 21 13:31:10.364945 coreos-metadata[1443]: Mar 21 13:31:10.364 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1
Mar 21 13:31:10.380910 coreos-metadata[1443]: Mar 21 13:31:10.380 INFO Fetch successful
Mar 21 13:31:10.380910 coreos-metadata[1443]: Mar 21 13:31:10.380 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1
Mar 21 13:31:10.391923 coreos-metadata[1443]: Mar 21 13:31:10.391 INFO Fetch successful
Mar 21 13:31:10.444618 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 21 13:31:10.448355 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 21 13:31:10.461856 coreos-metadata[1514]: Mar 21 13:31:10.461 WARN failed to locate config-drive, using the metadata service API instead
Mar 21 13:31:10.501308 coreos-metadata[1514]: Mar 21 13:31:10.501 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1
Mar 21 13:31:10.518039 coreos-metadata[1514]: Mar 21 13:31:10.517 INFO Fetch successful
Mar 21 13:31:10.518039 coreos-metadata[1514]: Mar 21 13:31:10.518 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1
Mar 21 13:31:10.534503 coreos-metadata[1514]: Mar 21 13:31:10.534 INFO Fetch successful
Mar 21 13:31:10.540505 unknown[1514]: wrote ssh authorized keys file for user: core
Mar 21 13:31:10.586883 update-ssh-keys[1630]: Updated "/home/core/.ssh/authorized_keys"
Mar 21 13:31:10.588587 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 21 13:31:10.590911 systemd[1]: Finished sshkeys.service.
Mar 21 13:31:10.597001 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 21 13:31:10.598076 systemd[1]: Startup finished in 1.112s (kernel) + 16.690s (initrd) + 11.469s (userspace) = 29.272s.
Mar 21 13:31:11.045543 systemd[1]: Started sshd@1-172.24.4.107:22-172.24.4.1:58500.service - OpenSSH per-connection server daemon (172.24.4.1:58500).
Mar 21 13:31:12.177054 sshd[1636]: Accepted publickey for core from 172.24.4.1 port 58500 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:31:12.179732 sshd-session[1636]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:31:12.193402 systemd-logind[1460]: New session 4 of user core.
Mar 21 13:31:12.199848 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 21 13:31:12.925483 sshd[1638]: Connection closed by 172.24.4.1 port 58500
Mar 21 13:31:12.925803 sshd-session[1636]: pam_unix(sshd:session): session closed for user core
Mar 21 13:31:12.942191 systemd[1]: sshd@1-172.24.4.107:22-172.24.4.1:58500.service: Deactivated successfully.
Mar 21 13:31:12.945294 systemd[1]: session-4.scope: Deactivated successfully.
Mar 21 13:31:12.948045 systemd-logind[1460]: Session 4 logged out. Waiting for processes to exit.
Mar 21 13:31:12.951868 systemd[1]: Started sshd@2-172.24.4.107:22-172.24.4.1:58504.service - OpenSSH per-connection server daemon (172.24.4.1:58504).
Mar 21 13:31:12.955043 systemd-logind[1460]: Removed session 4.
Mar 21 13:31:14.114493 sshd[1643]: Accepted publickey for core from 172.24.4.1 port 58504 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:31:14.117653 sshd-session[1643]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:31:14.130072 systemd-logind[1460]: New session 5 of user core.
Mar 21 13:31:14.139815 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 21 13:31:14.742481 sshd[1646]: Connection closed by 172.24.4.1 port 58504
Mar 21 13:31:14.743480 sshd-session[1643]: pam_unix(sshd:session): session closed for user core
Mar 21 13:31:14.756728 systemd[1]: sshd@2-172.24.4.107:22-172.24.4.1:58504.service: Deactivated successfully.
Mar 21 13:31:14.760070 systemd[1]: session-5.scope: Deactivated successfully.
Mar 21 13:31:14.762099 systemd-logind[1460]: Session 5 logged out. Waiting for processes to exit.
Mar 21 13:31:14.766577 systemd[1]: Started sshd@3-172.24.4.107:22-172.24.4.1:47850.service - OpenSSH per-connection server daemon (172.24.4.1:47850).
Mar 21 13:31:14.769342 systemd-logind[1460]: Removed session 5.
Mar 21 13:31:15.913963 sshd[1651]: Accepted publickey for core from 172.24.4.1 port 47850 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:31:15.916533 sshd-session[1651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:31:15.926934 systemd-logind[1460]: New session 6 of user core.
Mar 21 13:31:15.935728 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 21 13:31:16.712807 sshd[1654]: Connection closed by 172.24.4.1 port 47850
Mar 21 13:31:16.713879 sshd-session[1651]: pam_unix(sshd:session): session closed for user core
Mar 21 13:31:16.732145 systemd[1]: sshd@3-172.24.4.107:22-172.24.4.1:47850.service: Deactivated successfully.
Mar 21 13:31:16.736234 systemd[1]: session-6.scope: Deactivated successfully.
Mar 21 13:31:16.738783 systemd-logind[1460]: Session 6 logged out. Waiting for processes to exit.
Mar 21 13:31:16.743105 systemd[1]: Started sshd@4-172.24.4.107:22-172.24.4.1:47858.service - OpenSSH per-connection server daemon (172.24.4.1:47858).
Mar 21 13:31:16.746579 systemd-logind[1460]: Removed session 6.
Mar 21 13:31:17.901693 sshd[1659]: Accepted publickey for core from 172.24.4.1 port 47858 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:31:17.904392 sshd-session[1659]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:31:17.915944 systemd-logind[1460]: New session 7 of user core.
Mar 21 13:31:17.923750 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 21 13:31:18.350025 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 21 13:31:18.353176 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 21 13:31:18.422157 sudo[1664]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 21 13:31:18.422873 sudo[1664]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 21 13:31:18.456817 sudo[1664]: pam_unix(sudo:session): session closed for user root
Mar 21 13:31:18.677217 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 21 13:31:18.691213 sshd[1662]: Connection closed by 172.24.4.1 port 47858
Mar 21 13:31:18.690880 sshd-session[1659]: pam_unix(sshd:session): session closed for user core
Mar 21 13:31:18.695018 (kubelet)[1673]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 21 13:31:18.705788 systemd[1]: sshd@4-172.24.4.107:22-172.24.4.1:47858.service: Deactivated successfully.
Mar 21 13:31:18.710392 systemd[1]: session-7.scope: Deactivated successfully.
Mar 21 13:31:18.714134 systemd-logind[1460]: Session 7 logged out. Waiting for processes to exit.
Mar 21 13:31:18.717299 systemd[1]: Started sshd@5-172.24.4.107:22-172.24.4.1:47870.service - OpenSSH per-connection server daemon (172.24.4.1:47870).
Mar 21 13:31:18.724988 systemd-logind[1460]: Removed session 7.
Mar 21 13:31:18.791589 kubelet[1673]: E0321 13:31:18.791390 1673 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 21 13:31:18.794492 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 21 13:31:18.794631 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 21 13:31:18.794955 systemd[1]: kubelet.service: Consumed 275ms CPU time, 95.7M memory peak.
Mar 21 13:31:20.198510 sshd[1681]: Accepted publickey for core from 172.24.4.1 port 47870 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:31:20.201395 sshd-session[1681]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:31:20.213408 systemd-logind[1460]: New session 8 of user core.
Mar 21 13:31:20.223811 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 21 13:31:20.756408 sudo[1688]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 21 13:31:20.757160 sudo[1688]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 21 13:31:20.764354 sudo[1688]: pam_unix(sudo:session): session closed for user root
Mar 21 13:31:20.775893 sudo[1687]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 21 13:31:20.776556 sudo[1687]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 21 13:31:20.798059 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 21 13:31:20.876464 augenrules[1710]: No rules
Mar 21 13:31:20.877853 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 21 13:31:20.878383 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 21 13:31:20.881116 sudo[1687]: pam_unix(sudo:session): session closed for user root
Mar 21 13:31:21.065655 sshd[1686]: Connection closed by 172.24.4.1 port 47870
Mar 21 13:31:21.067228 sshd-session[1681]: pam_unix(sshd:session): session closed for user core
Mar 21 13:31:21.084097 systemd[1]: sshd@5-172.24.4.107:22-172.24.4.1:47870.service: Deactivated successfully.
Mar 21 13:31:21.088292 systemd[1]: session-8.scope: Deactivated successfully.
Mar 21 13:31:21.092613 systemd-logind[1460]: Session 8 logged out. Waiting for processes to exit.
Mar 21 13:31:21.094832 systemd[1]: Started sshd@6-172.24.4.107:22-172.24.4.1:47878.service - OpenSSH per-connection server daemon (172.24.4.1:47878).
Mar 21 13:31:21.098085 systemd-logind[1460]: Removed session 8.
Mar 21 13:31:22.401251 sshd[1718]: Accepted publickey for core from 172.24.4.1 port 47878 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:31:22.403038 sshd-session[1718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:31:22.414569 systemd-logind[1460]: New session 9 of user core.
Mar 21 13:31:22.429771 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 21 13:31:22.858273 sudo[1722]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 21 13:31:22.861849 sudo[1722]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 21 13:31:23.533227 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 21 13:31:23.546936 (dockerd)[1740]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 21 13:31:24.128594 dockerd[1740]: time="2025-03-21T13:31:24.128524195Z" level=info msg="Starting up"
Mar 21 13:31:24.130478 dockerd[1740]: time="2025-03-21T13:31:24.130420291Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 21 13:31:24.177420 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2338070992-merged.mount: Deactivated successfully.
Mar 21 13:31:24.255777 dockerd[1740]: time="2025-03-21T13:31:24.255693141Z" level=info msg="Loading containers: start."
Mar 21 13:31:24.480528 kernel: Initializing XFRM netlink socket
Mar 21 13:31:24.618671 systemd-networkd[1380]: docker0: Link UP
Mar 21 13:31:24.687531 dockerd[1740]: time="2025-03-21T13:31:24.686859182Z" level=info msg="Loading containers: done."
Mar 21 13:31:24.714496 dockerd[1740]: time="2025-03-21T13:31:24.713709660Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 21 13:31:24.714496 dockerd[1740]: time="2025-03-21T13:31:24.713875782Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
Mar 21 13:31:24.714496 dockerd[1740]: time="2025-03-21T13:31:24.714092929Z" level=info msg="Daemon has completed initialization"
Mar 21 13:31:24.784428 dockerd[1740]: time="2025-03-21T13:31:24.784164589Z" level=info msg="API listen on /run/docker.sock"
Mar 21 13:31:24.784295 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 21 13:31:26.460531 containerd[1480]: time="2025-03-21T13:31:26.459502585Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\""
Mar 21 13:31:27.240148 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1647037691.mount: Deactivated successfully.
Mar 21 13:31:28.849200 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 21 13:31:28.853818 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 21 13:31:28.979974 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 21 13:31:28.989674 (kubelet)[2008]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 21 13:31:29.332882 kubelet[2008]: E0321 13:31:29.332660 2008 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 21 13:31:29.337658 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 21 13:31:29.337994 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 21 13:31:29.338840 systemd[1]: kubelet.service: Consumed 161ms CPU time, 96.1M memory peak.
Mar 21 13:31:29.386770 containerd[1480]: time="2025-03-21T13:31:29.386160240Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:29.409799 containerd[1480]: time="2025-03-21T13:31:29.409665885Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=32674581"
Mar 21 13:31:29.420009 containerd[1480]: time="2025-03-21T13:31:29.419865124Z" level=info msg="ImageCreate event name:\"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:29.443808 containerd[1480]: time="2025-03-21T13:31:29.443566746Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:29.447178 containerd[1480]: time="2025-03-21T13:31:29.446195546Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"32671373\" in 2.986609626s"
Mar 21 13:31:29.447178 containerd[1480]: time="2025-03-21T13:31:29.446274525Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\""
Mar 21 13:31:29.486238 containerd[1480]: time="2025-03-21T13:31:29.486089678Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\""
Mar 21 13:31:32.204520 containerd[1480]: time="2025-03-21T13:31:32.204316572Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:32.220578 containerd[1480]: time="2025-03-21T13:31:32.220415305Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=29619780"
Mar 21 13:31:32.235847 containerd[1480]: time="2025-03-21T13:31:32.235708447Z" level=info msg="ImageCreate event name:\"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:32.256633 containerd[1480]: time="2025-03-21T13:31:32.256519228Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:32.259698 containerd[1480]: time="2025-03-21T13:31:32.259416211Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"31107380\" in 2.773251332s"
Mar 21 13:31:32.259698 containerd[1480]: time="2025-03-21T13:31:32.259539192Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\""
Mar 21 13:31:32.298402 containerd[1480]: time="2025-03-21T13:31:32.298341115Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\""
Mar 21 13:31:33.843218 containerd[1480]: time="2025-03-21T13:31:33.843055907Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:33.844222 containerd[1480]: time="2025-03-21T13:31:33.844172361Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=17903317"
Mar 21 13:31:33.845652 containerd[1480]: time="2025-03-21T13:31:33.845609747Z" level=info msg="ImageCreate event name:\"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:33.848618 containerd[1480]: time="2025-03-21T13:31:33.848577132Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:33.849632 containerd[1480]: time="2025-03-21T13:31:33.849480647Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"19390935\" in 1.550714104s"
Mar 21 13:31:33.849632 containerd[1480]: time="2025-03-21T13:31:33.849519860Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\""
Mar 21 13:31:33.866777 containerd[1480]: time="2025-03-21T13:31:33.866713637Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\""
Mar 21 13:31:35.217677 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1233452030.mount: Deactivated successfully.
Mar 21 13:31:35.709678 containerd[1480]: time="2025-03-21T13:31:35.709638262Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:35.711072 containerd[1480]: time="2025-03-21T13:31:35.711031374Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=29185380"
Mar 21 13:31:35.712426 containerd[1480]: time="2025-03-21T13:31:35.712402075Z" level=info msg="ImageCreate event name:\"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:35.715005 containerd[1480]: time="2025-03-21T13:31:35.714907334Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:35.715459 containerd[1480]: time="2025-03-21T13:31:35.715408884Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"29184391\" in 1.848404252s"
Mar 21 13:31:35.715515 containerd[1480]: time="2025-03-21T13:31:35.715458808Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\""
Mar 21 13:31:35.735090 containerd[1480]: time="2025-03-21T13:31:35.735056644Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Mar 21 13:31:36.371721 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount591261881.mount: Deactivated successfully.
Mar 21 13:31:38.072769 containerd[1480]: time="2025-03-21T13:31:38.072682841Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:38.074491 containerd[1480]: time="2025-03-21T13:31:38.074418174Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769"
Mar 21 13:31:38.075964 containerd[1480]: time="2025-03-21T13:31:38.075914836Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:38.079343 containerd[1480]: time="2025-03-21T13:31:38.079297250Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:38.082682 containerd[1480]: time="2025-03-21T13:31:38.080667716Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.345567232s"
Mar 21 13:31:38.082682 containerd[1480]: time="2025-03-21T13:31:38.080712373Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Mar 21 13:31:38.101196 containerd[1480]: time="2025-03-21T13:31:38.101164835Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Mar 21 13:31:38.725759 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount220463397.mount: Deactivated successfully.
Mar 21 13:31:38.738693 containerd[1480]: time="2025-03-21T13:31:38.738554788Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:38.740522 containerd[1480]: time="2025-03-21T13:31:38.740202510Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298"
Mar 21 13:31:38.742328 containerd[1480]: time="2025-03-21T13:31:38.742223844Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:38.748159 containerd[1480]: time="2025-03-21T13:31:38.748025215Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:31:38.750899 containerd[1480]: time="2025-03-21T13:31:38.749993027Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 648.64639ms"
Mar 21 13:31:38.750899 containerd[1480]: time="2025-03-21T13:31:38.750077182Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Mar 21 13:31:38.791398 containerd[1480]: time="2025-03-21T13:31:38.791305275Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\""
Mar 21 13:31:39.350155 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 21 13:31:39.355680 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 21 13:31:39.529707 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 21 13:31:39.538696 (kubelet)[2133]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 21 13:31:39.590617 kubelet[2133]: E0321 13:31:39.590527 2133 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 21 13:31:39.594037 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 21 13:31:39.594333 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 21 13:31:39.594959 systemd[1]: kubelet.service: Consumed 193ms CPU time, 97.6M memory peak.
Mar 21 13:31:39.796740 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4209112579.mount: Deactivated successfully.
Mar 21 13:31:42.784101 containerd[1480]: time="2025-03-21T13:31:42.784006971Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:31:42.785701 containerd[1480]: time="2025-03-21T13:31:42.785534381Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238579" Mar 21 13:31:42.787070 containerd[1480]: time="2025-03-21T13:31:42.787012304Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:31:42.790033 containerd[1480]: time="2025-03-21T13:31:42.789989438Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:31:42.791397 containerd[1480]: time="2025-03-21T13:31:42.791076595Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 3.999710926s" Mar 21 13:31:42.791397 containerd[1480]: time="2025-03-21T13:31:42.791117406Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" Mar 21 13:31:47.118052 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 13:31:47.118519 systemd[1]: kubelet.service: Consumed 193ms CPU time, 97.6M memory peak. Mar 21 13:31:47.122692 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 13:31:47.150480 systemd[1]: Reload requested from client PID 2270 ('systemctl') (unit session-9.scope)... 
Mar 21 13:31:47.150495 systemd[1]: Reloading... Mar 21 13:31:47.252469 zram_generator::config[2316]: No configuration found. Mar 21 13:31:47.408923 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 21 13:31:47.526349 systemd[1]: Reloading finished in 375 ms. Mar 21 13:31:47.566472 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 21 13:31:47.566688 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 21 13:31:47.566991 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 13:31:47.567027 systemd[1]: kubelet.service: Consumed 117ms CPU time, 83.5M memory peak. Mar 21 13:31:47.568663 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 13:31:47.694926 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 13:31:47.701985 (kubelet)[2380]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 21 13:31:47.959041 kubelet[2380]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 13:31:47.959041 kubelet[2380]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 21 13:31:47.959041 kubelet[2380]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 21 13:31:47.966911 kubelet[2380]: I0321 13:31:47.961320 2380 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 21 13:31:48.350723 update_engine[1470]: I20250321 13:31:48.350039 1470 update_attempter.cc:509] Updating boot flags... Mar 21 13:31:48.465937 kubelet[2380]: I0321 13:31:48.465883 2380 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 21 13:31:48.466119 kubelet[2380]: I0321 13:31:48.466107 2380 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 21 13:31:48.466410 kubelet[2380]: I0321 13:31:48.466395 2380 server.go:927] "Client rotation is on, will bootstrap in background" Mar 21 13:31:48.878566 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (2395) Mar 21 13:31:49.096625 kubelet[2380]: E0321 13:31:49.096585 2380 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.107:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:49.099609 kubelet[2380]: I0321 13:31:49.098185 2380 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 21 13:31:49.130755 kubelet[2380]: I0321 13:31:49.130680 2380 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 21 13:31:49.132088 kubelet[2380]: I0321 13:31:49.131654 2380 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 21 13:31:49.132088 kubelet[2380]: I0321 13:31:49.131684 2380 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-0-3-4-20a459a426.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 21 13:31:49.132088 kubelet[2380]: I0321 13:31:49.131881 2380 topology_manager.go:138] "Creating topology manager with none 
policy" Mar 21 13:31:49.132088 kubelet[2380]: I0321 13:31:49.131894 2380 container_manager_linux.go:301] "Creating device plugin manager" Mar 21 13:31:49.132322 kubelet[2380]: I0321 13:31:49.131996 2380 state_mem.go:36] "Initialized new in-memory state store" Mar 21 13:31:49.134490 kubelet[2380]: I0321 13:31:49.133579 2380 kubelet.go:400] "Attempting to sync node with API server" Mar 21 13:31:49.135473 kubelet[2380]: I0321 13:31:49.134555 2380 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 21 13:31:49.135473 kubelet[2380]: I0321 13:31:49.134581 2380 kubelet.go:312] "Adding apiserver pod source" Mar 21 13:31:49.135473 kubelet[2380]: I0321 13:31:49.134593 2380 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 21 13:31:49.146850 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (2398) Mar 21 13:31:49.149377 kubelet[2380]: W0321 13:31:49.149143 2380 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-3-4-20a459a426.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:49.149377 kubelet[2380]: E0321 13:31:49.149209 2380 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-3-4-20a459a426.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:49.154290 kubelet[2380]: I0321 13:31:49.154080 2380 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 21 13:31:49.162559 kubelet[2380]: I0321 13:31:49.162536 2380 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 21 13:31:49.162626 
kubelet[2380]: W0321 13:31:49.162592 2380 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 21 13:31:49.163509 kubelet[2380]: I0321 13:31:49.163127 2380 server.go:1264] "Started kubelet" Mar 21 13:31:49.163509 kubelet[2380]: W0321 13:31:49.163253 2380 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:49.163509 kubelet[2380]: E0321 13:31:49.163311 2380 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:49.189486 kubelet[2380]: I0321 13:31:49.188854 2380 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 21 13:31:49.189486 kubelet[2380]: I0321 13:31:49.189404 2380 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 21 13:31:49.189746 kubelet[2380]: I0321 13:31:49.189726 2380 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 21 13:31:49.189937 kubelet[2380]: E0321 13:31:49.189839 2380 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.107:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.107:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-9999-0-3-4-20a459a426.novalocal.182ed4a494154540 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-9999-0-3-4-20a459a426.novalocal,UID:ci-9999-0-3-4-20a459a426.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ci-9999-0-3-4-20a459a426.novalocal,},FirstTimestamp:2025-03-21 13:31:49.163107648 +0000 UTC m=+1.455560094,LastTimestamp:2025-03-21 13:31:49.163107648 +0000 UTC m=+1.455560094,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-0-3-4-20a459a426.novalocal,}" Mar 21 13:31:49.190734 kubelet[2380]: I0321 13:31:49.190409 2380 server.go:455] "Adding debug handlers to kubelet server" Mar 21 13:31:49.199996 kubelet[2380]: I0321 13:31:49.199973 2380 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 21 13:31:49.202810 kubelet[2380]: E0321 13:31:49.202788 2380 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 21 13:31:49.214256 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (2398) Mar 21 13:31:49.214322 kubelet[2380]: I0321 13:31:49.213090 2380 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 21 13:31:49.215098 kubelet[2380]: I0321 13:31:49.214655 2380 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 21 13:31:49.215098 kubelet[2380]: I0321 13:31:49.214713 2380 reconciler.go:26] "Reconciler: start to sync state" Mar 21 13:31:49.215098 kubelet[2380]: W0321 13:31:49.215036 2380 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:49.215098 kubelet[2380]: E0321 13:31:49.215081 2380 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: 
connect: connection refused Mar 21 13:31:49.215225 kubelet[2380]: E0321 13:31:49.215132 2380 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-3-4-20a459a426.novalocal?timeout=10s\": dial tcp 172.24.4.107:6443: connect: connection refused" interval="200ms" Mar 21 13:31:49.235180 kubelet[2380]: I0321 13:31:49.234681 2380 factory.go:221] Registration of the containerd container factory successfully Mar 21 13:31:49.235180 kubelet[2380]: I0321 13:31:49.234702 2380 factory.go:221] Registration of the systemd container factory successfully Mar 21 13:31:49.235180 kubelet[2380]: I0321 13:31:49.234792 2380 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 21 13:31:49.273773 kubelet[2380]: I0321 13:31:49.273750 2380 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 21 13:31:49.274356 kubelet[2380]: I0321 13:31:49.274343 2380 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 21 13:31:49.274460 kubelet[2380]: I0321 13:31:49.274426 2380 state_mem.go:36] "Initialized new in-memory state store" Mar 21 13:31:49.274798 kubelet[2380]: I0321 13:31:49.274765 2380 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 21 13:31:49.277533 kubelet[2380]: I0321 13:31:49.277511 2380 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 21 13:31:49.278470 kubelet[2380]: I0321 13:31:49.277602 2380 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 21 13:31:49.278470 kubelet[2380]: I0321 13:31:49.277625 2380 kubelet.go:2337] "Starting kubelet main sync loop" Mar 21 13:31:49.278470 kubelet[2380]: E0321 13:31:49.277662 2380 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 21 13:31:49.279158 kubelet[2380]: W0321 13:31:49.279013 2380 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:49.279251 kubelet[2380]: E0321 13:31:49.279221 2380 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:49.281978 kubelet[2380]: I0321 13:31:49.281931 2380 policy_none.go:49] "None policy: Start" Mar 21 13:31:49.282919 kubelet[2380]: I0321 13:31:49.282628 2380 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 21 13:31:49.282919 kubelet[2380]: I0321 13:31:49.282649 2380 state_mem.go:35] "Initializing new in-memory state store" Mar 21 13:31:49.290379 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 21 13:31:49.307335 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 21 13:31:49.310842 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Mar 21 13:31:49.315014 kubelet[2380]: I0321 13:31:49.314741 2380 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.315114 kubelet[2380]: E0321 13:31:49.315037 2380 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.107:6443/api/v1/nodes\": dial tcp 172.24.4.107:6443: connect: connection refused" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.320783 kubelet[2380]: I0321 13:31:49.320110 2380 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 21 13:31:49.320783 kubelet[2380]: I0321 13:31:49.320262 2380 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 21 13:31:49.320783 kubelet[2380]: I0321 13:31:49.320367 2380 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 21 13:31:49.322563 kubelet[2380]: E0321 13:31:49.322548 2380 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-9999-0-3-4-20a459a426.novalocal\" not found" Mar 21 13:31:49.377944 kubelet[2380]: I0321 13:31:49.377877 2380 topology_manager.go:215] "Topology Admit Handler" podUID="f66d2b5be1f30e6601075988b6283499" podNamespace="kube-system" podName="kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.381048 kubelet[2380]: I0321 13:31:49.380844 2380 topology_manager.go:215] "Topology Admit Handler" podUID="32dfb10191a1a78e54c1e3a24a89d4c9" podNamespace="kube-system" podName="kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.387648 kubelet[2380]: I0321 13:31:49.386029 2380 topology_manager.go:215] "Topology Admit Handler" podUID="5dffe885ae7e967c8aa6ad6ae4490f4e" podNamespace="kube-system" podName="kube-scheduler-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.404612 systemd[1]: Created slice 
kubepods-burstable-podf66d2b5be1f30e6601075988b6283499.slice - libcontainer container kubepods-burstable-podf66d2b5be1f30e6601075988b6283499.slice. Mar 21 13:31:49.416318 kubelet[2380]: E0321 13:31:49.416248 2380 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-3-4-20a459a426.novalocal?timeout=10s\": dial tcp 172.24.4.107:6443: connect: connection refused" interval="400ms" Mar 21 13:31:49.426315 systemd[1]: Created slice kubepods-burstable-pod32dfb10191a1a78e54c1e3a24a89d4c9.slice - libcontainer container kubepods-burstable-pod32dfb10191a1a78e54c1e3a24a89d4c9.slice. Mar 21 13:31:49.435126 systemd[1]: Created slice kubepods-burstable-pod5dffe885ae7e967c8aa6ad6ae4490f4e.slice - libcontainer container kubepods-burstable-pod5dffe885ae7e967c8aa6ad6ae4490f4e.slice. Mar 21 13:31:49.516738 kubelet[2380]: I0321 13:31:49.516126 2380 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f66d2b5be1f30e6601075988b6283499-k8s-certs\") pod \"kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"f66d2b5be1f30e6601075988b6283499\") " pod="kube-system/kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.516738 kubelet[2380]: I0321 13:31:49.516208 2380 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f66d2b5be1f30e6601075988b6283499-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"f66d2b5be1f30e6601075988b6283499\") " pod="kube-system/kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.516738 kubelet[2380]: I0321 13:31:49.516257 2380 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/32dfb10191a1a78e54c1e3a24a89d4c9-ca-certs\") pod \"kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"32dfb10191a1a78e54c1e3a24a89d4c9\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.516738 kubelet[2380]: I0321 13:31:49.516299 2380 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/32dfb10191a1a78e54c1e3a24a89d4c9-k8s-certs\") pod \"kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"32dfb10191a1a78e54c1e3a24a89d4c9\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.516738 kubelet[2380]: I0321 13:31:49.516345 2380 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dffe885ae7e967c8aa6ad6ae4490f4e-kubeconfig\") pod \"kube-scheduler-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"5dffe885ae7e967c8aa6ad6ae4490f4e\") " pod="kube-system/kube-scheduler-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.517292 kubelet[2380]: I0321 13:31:49.516388 2380 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f66d2b5be1f30e6601075988b6283499-ca-certs\") pod \"kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"f66d2b5be1f30e6601075988b6283499\") " pod="kube-system/kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.517292 kubelet[2380]: I0321 13:31:49.516431 2380 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/32dfb10191a1a78e54c1e3a24a89d4c9-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"32dfb10191a1a78e54c1e3a24a89d4c9\") " 
pod="kube-system/kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.517292 kubelet[2380]: I0321 13:31:49.516534 2380 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/32dfb10191a1a78e54c1e3a24a89d4c9-kubeconfig\") pod \"kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"32dfb10191a1a78e54c1e3a24a89d4c9\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.517292 kubelet[2380]: I0321 13:31:49.516580 2380 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/32dfb10191a1a78e54c1e3a24a89d4c9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"32dfb10191a1a78e54c1e3a24a89d4c9\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.519704 kubelet[2380]: I0321 13:31:49.519253 2380 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.519884 kubelet[2380]: E0321 13:31:49.519824 2380 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.107:6443/api/v1/nodes\": dial tcp 172.24.4.107:6443: connect: connection refused" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.720798 containerd[1480]: time="2025-03-21T13:31:49.720244183Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal,Uid:f66d2b5be1f30e6601075988b6283499,Namespace:kube-system,Attempt:0,}" Mar 21 13:31:49.733765 containerd[1480]: time="2025-03-21T13:31:49.733615039Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal,Uid:32dfb10191a1a78e54c1e3a24a89d4c9,Namespace:kube-system,Attempt:0,}" Mar 21 
13:31:49.742686 containerd[1480]: time="2025-03-21T13:31:49.742405077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-0-3-4-20a459a426.novalocal,Uid:5dffe885ae7e967c8aa6ad6ae4490f4e,Namespace:kube-system,Attempt:0,}" Mar 21 13:31:49.818105 kubelet[2380]: E0321 13:31:49.817990 2380 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-3-4-20a459a426.novalocal?timeout=10s\": dial tcp 172.24.4.107:6443: connect: connection refused" interval="800ms" Mar 21 13:31:49.923646 kubelet[2380]: I0321 13:31:49.923568 2380 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:49.924509 kubelet[2380]: E0321 13:31:49.924315 2380 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.107:6443/api/v1/nodes\": dial tcp 172.24.4.107:6443: connect: connection refused" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:50.171551 kubelet[2380]: W0321 13:31:50.171426 2380 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:50.171551 kubelet[2380]: E0321 13:31:50.171563 2380 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:50.309903 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1145409729.mount: Deactivated successfully. 
Mar 21 13:31:50.319272 containerd[1480]: time="2025-03-21T13:31:50.319176688Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 21 13:31:50.325536 containerd[1480]: time="2025-03-21T13:31:50.325193337Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Mar 21 13:31:50.327220 containerd[1480]: time="2025-03-21T13:31:50.326965022Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 21 13:31:50.328654 containerd[1480]: time="2025-03-21T13:31:50.328535974Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 21 13:31:50.331813 containerd[1480]: time="2025-03-21T13:31:50.331751903Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 21 13:31:50.334141 containerd[1480]: time="2025-03-21T13:31:50.333433747Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 21 13:31:50.335933 containerd[1480]: time="2025-03-21T13:31:50.335726491Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 21 13:31:50.337236 containerd[1480]: time="2025-03-21T13:31:50.337195209Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 21 
13:31:50.342537 containerd[1480]: time="2025-03-21T13:31:50.341688963Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 615.444359ms" Mar 21 13:31:50.345784 containerd[1480]: time="2025-03-21T13:31:50.345704755Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 607.785835ms" Mar 21 13:31:50.349628 containerd[1480]: time="2025-03-21T13:31:50.349558474Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 597.90831ms" Mar 21 13:31:50.370607 kubelet[2380]: W0321 13:31:50.370483 2380 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-3-4-20a459a426.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:50.370728 kubelet[2380]: E0321 13:31:50.370619 2380 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-9999-0-3-4-20a459a426.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:50.417458 containerd[1480]: 
time="2025-03-21T13:31:50.417123151Z" level=info msg="connecting to shim d2f5e52749ea26c8d0dcba5b3b0781e4ff1e1ae94e053d89948befa533324331" address="unix:///run/containerd/s/0a1c71b1fc04c78b0b42f50b7055a966c5b771ea1b75826bb06c8e23151c68f6" namespace=k8s.io protocol=ttrpc version=3 Mar 21 13:31:50.419024 containerd[1480]: time="2025-03-21T13:31:50.418997641Z" level=info msg="connecting to shim 2d199e12476ad7c9cf9d1af444402f5a8ebe85503b8f121ffbc50775024ea94c" address="unix:///run/containerd/s/945cb6fa1f367776ed3aa80119855495d392250ce61532868ff68e64eaedb6a4" namespace=k8s.io protocol=ttrpc version=3 Mar 21 13:31:50.421331 containerd[1480]: time="2025-03-21T13:31:50.421302438Z" level=info msg="connecting to shim 3dd6cbcf37b2177d9e8d4de8c65348677b445ae455329a5861428f4a5c6cf5b0" address="unix:///run/containerd/s/469771967387c3f0dbaca9995db687c14d34bad077727b2dcceb82153d8eb70f" namespace=k8s.io protocol=ttrpc version=3 Mar 21 13:31:50.430958 kubelet[2380]: W0321 13:31:50.430521 2380 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:50.430958 kubelet[2380]: E0321 13:31:50.430560 2380 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:50.440475 kubelet[2380]: W0321 13:31:50.440117 2380 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:50.440475 kubelet[2380]: E0321 13:31:50.440181 2380 reflector.go:150] k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.107:6443: connect: connection refused Mar 21 13:31:50.451813 systemd[1]: Started cri-containerd-d2f5e52749ea26c8d0dcba5b3b0781e4ff1e1ae94e053d89948befa533324331.scope - libcontainer container d2f5e52749ea26c8d0dcba5b3b0781e4ff1e1ae94e053d89948befa533324331. Mar 21 13:31:50.466585 systemd[1]: Started cri-containerd-2d199e12476ad7c9cf9d1af444402f5a8ebe85503b8f121ffbc50775024ea94c.scope - libcontainer container 2d199e12476ad7c9cf9d1af444402f5a8ebe85503b8f121ffbc50775024ea94c. Mar 21 13:31:50.471951 systemd[1]: Started cri-containerd-3dd6cbcf37b2177d9e8d4de8c65348677b445ae455329a5861428f4a5c6cf5b0.scope - libcontainer container 3dd6cbcf37b2177d9e8d4de8c65348677b445ae455329a5861428f4a5c6cf5b0. Mar 21 13:31:50.539146 containerd[1480]: time="2025-03-21T13:31:50.539012951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal,Uid:32dfb10191a1a78e54c1e3a24a89d4c9,Namespace:kube-system,Attempt:0,} returns sandbox id \"d2f5e52749ea26c8d0dcba5b3b0781e4ff1e1ae94e053d89948befa533324331\"" Mar 21 13:31:50.545208 containerd[1480]: time="2025-03-21T13:31:50.545011949Z" level=info msg="CreateContainer within sandbox \"d2f5e52749ea26c8d0dcba5b3b0781e4ff1e1ae94e053d89948befa533324331\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 21 13:31:50.552313 containerd[1480]: time="2025-03-21T13:31:50.552221739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal,Uid:f66d2b5be1f30e6601075988b6283499,Namespace:kube-system,Attempt:0,} returns sandbox id \"2d199e12476ad7c9cf9d1af444402f5a8ebe85503b8f121ffbc50775024ea94c\"" Mar 21 13:31:50.557818 containerd[1480]: time="2025-03-21T13:31:50.557652573Z" level=info msg="CreateContainer within sandbox 
\"2d199e12476ad7c9cf9d1af444402f5a8ebe85503b8f121ffbc50775024ea94c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 21 13:31:50.560150 containerd[1480]: time="2025-03-21T13:31:50.560116376Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-9999-0-3-4-20a459a426.novalocal,Uid:5dffe885ae7e967c8aa6ad6ae4490f4e,Namespace:kube-system,Attempt:0,} returns sandbox id \"3dd6cbcf37b2177d9e8d4de8c65348677b445ae455329a5861428f4a5c6cf5b0\"" Mar 21 13:31:50.562156 containerd[1480]: time="2025-03-21T13:31:50.562126370Z" level=info msg="CreateContainer within sandbox \"3dd6cbcf37b2177d9e8d4de8c65348677b445ae455329a5861428f4a5c6cf5b0\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 21 13:31:50.568757 containerd[1480]: time="2025-03-21T13:31:50.568723276Z" level=info msg="Container 9d715641f0aec4c114cc19ea9f86721519c854a10dfa8609c0ad200986ffcdb1: CDI devices from CRI Config.CDIDevices: []" Mar 21 13:31:50.574762 containerd[1480]: time="2025-03-21T13:31:50.574733564Z" level=info msg="Container 3394bc2c134f32fa30e03dd4de582abd748cc62eceea48afca90ce9b659343df: CDI devices from CRI Config.CDIDevices: []" Mar 21 13:31:50.589362 containerd[1480]: time="2025-03-21T13:31:50.589244963Z" level=info msg="CreateContainer within sandbox \"d2f5e52749ea26c8d0dcba5b3b0781e4ff1e1ae94e053d89948befa533324331\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"9d715641f0aec4c114cc19ea9f86721519c854a10dfa8609c0ad200986ffcdb1\"" Mar 21 13:31:50.590112 containerd[1480]: time="2025-03-21T13:31:50.590087171Z" level=info msg="StartContainer for \"9d715641f0aec4c114cc19ea9f86721519c854a10dfa8609c0ad200986ffcdb1\"" Mar 21 13:31:50.591395 containerd[1480]: time="2025-03-21T13:31:50.591370916Z" level=info msg="connecting to shim 9d715641f0aec4c114cc19ea9f86721519c854a10dfa8609c0ad200986ffcdb1" address="unix:///run/containerd/s/0a1c71b1fc04c78b0b42f50b7055a966c5b771ea1b75826bb06c8e23151c68f6" 
protocol=ttrpc version=3 Mar 21 13:31:50.594726 containerd[1480]: time="2025-03-21T13:31:50.594703235Z" level=info msg="CreateContainer within sandbox \"2d199e12476ad7c9cf9d1af444402f5a8ebe85503b8f121ffbc50775024ea94c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"3394bc2c134f32fa30e03dd4de582abd748cc62eceea48afca90ce9b659343df\"" Mar 21 13:31:50.597640 containerd[1480]: time="2025-03-21T13:31:50.597600259Z" level=info msg="StartContainer for \"3394bc2c134f32fa30e03dd4de582abd748cc62eceea48afca90ce9b659343df\"" Mar 21 13:31:50.598923 containerd[1480]: time="2025-03-21T13:31:50.598754229Z" level=info msg="connecting to shim 3394bc2c134f32fa30e03dd4de582abd748cc62eceea48afca90ce9b659343df" address="unix:///run/containerd/s/945cb6fa1f367776ed3aa80119855495d392250ce61532868ff68e64eaedb6a4" protocol=ttrpc version=3 Mar 21 13:31:50.603175 containerd[1480]: time="2025-03-21T13:31:50.603137344Z" level=info msg="Container 93ef758a66af93eba7f5a5b0d6e726b6a8f77f28e767809b7e8bd7e6ec0f6ca0: CDI devices from CRI Config.CDIDevices: []" Mar 21 13:31:50.618387 kubelet[2380]: E0321 13:31:50.618340 2380 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-9999-0-3-4-20a459a426.novalocal?timeout=10s\": dial tcp 172.24.4.107:6443: connect: connection refused" interval="1.6s" Mar 21 13:31:50.620387 containerd[1480]: time="2025-03-21T13:31:50.620334325Z" level=info msg="CreateContainer within sandbox \"3dd6cbcf37b2177d9e8d4de8c65348677b445ae455329a5861428f4a5c6cf5b0\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"93ef758a66af93eba7f5a5b0d6e726b6a8f77f28e767809b7e8bd7e6ec0f6ca0\"" Mar 21 13:31:50.621489 containerd[1480]: time="2025-03-21T13:31:50.620643211Z" level=info msg="StartContainer for \"93ef758a66af93eba7f5a5b0d6e726b6a8f77f28e767809b7e8bd7e6ec0f6ca0\"" Mar 21 13:31:50.621806 systemd[1]: Started 
cri-containerd-9d715641f0aec4c114cc19ea9f86721519c854a10dfa8609c0ad200986ffcdb1.scope - libcontainer container 9d715641f0aec4c114cc19ea9f86721519c854a10dfa8609c0ad200986ffcdb1. Mar 21 13:31:50.622041 containerd[1480]: time="2025-03-21T13:31:50.622001451Z" level=info msg="connecting to shim 93ef758a66af93eba7f5a5b0d6e726b6a8f77f28e767809b7e8bd7e6ec0f6ca0" address="unix:///run/containerd/s/469771967387c3f0dbaca9995db687c14d34bad077727b2dcceb82153d8eb70f" protocol=ttrpc version=3 Mar 21 13:31:50.625884 systemd[1]: Started cri-containerd-3394bc2c134f32fa30e03dd4de582abd748cc62eceea48afca90ce9b659343df.scope - libcontainer container 3394bc2c134f32fa30e03dd4de582abd748cc62eceea48afca90ce9b659343df. Mar 21 13:31:50.659578 systemd[1]: Started cri-containerd-93ef758a66af93eba7f5a5b0d6e726b6a8f77f28e767809b7e8bd7e6ec0f6ca0.scope - libcontainer container 93ef758a66af93eba7f5a5b0d6e726b6a8f77f28e767809b7e8bd7e6ec0f6ca0. Mar 21 13:31:50.710008 containerd[1480]: time="2025-03-21T13:31:50.708736848Z" level=info msg="StartContainer for \"9d715641f0aec4c114cc19ea9f86721519c854a10dfa8609c0ad200986ffcdb1\" returns successfully" Mar 21 13:31:50.728459 kubelet[2380]: I0321 13:31:50.727981 2380 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:50.728459 kubelet[2380]: E0321 13:31:50.728267 2380 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.107:6443/api/v1/nodes\": dial tcp 172.24.4.107:6443: connect: connection refused" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:50.741881 containerd[1480]: time="2025-03-21T13:31:50.741787336Z" level=info msg="StartContainer for \"3394bc2c134f32fa30e03dd4de582abd748cc62eceea48afca90ce9b659343df\" returns successfully" Mar 21 13:31:50.747778 containerd[1480]: time="2025-03-21T13:31:50.747744569Z" level=info msg="StartContainer for \"93ef758a66af93eba7f5a5b0d6e726b6a8f77f28e767809b7e8bd7e6ec0f6ca0\" returns successfully" Mar 21 
13:31:52.329964 kubelet[2380]: I0321 13:31:52.329585 2380 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:52.617808 kubelet[2380]: E0321 13:31:52.617749 2380 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-9999-0-3-4-20a459a426.novalocal\" not found" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:52.662142 kubelet[2380]: I0321 13:31:52.662091 2380 kubelet_node_status.go:76] "Successfully registered node" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:52.670767 kubelet[2380]: E0321 13:31:52.670741 2380 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-9999-0-3-4-20a459a426.novalocal\" not found" Mar 21 13:31:52.677866 kubelet[2380]: E0321 13:31:52.677769 2380 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-9999-0-3-4-20a459a426.novalocal.182ed4a494154540 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-9999-0-3-4-20a459a426.novalocal,UID:ci-9999-0-3-4-20a459a426.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-9999-0-3-4-20a459a426.novalocal,},FirstTimestamp:2025-03-21 13:31:49.163107648 +0000 UTC m=+1.455560094,LastTimestamp:2025-03-21 13:31:49.163107648 +0000 UTC m=+1.455560094,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-0-3-4-20a459a426.novalocal,}" Mar 21 13:31:52.737448 kubelet[2380]: E0321 13:31:52.737223 2380 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-9999-0-3-4-20a459a426.novalocal.182ed4a4967286ac default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-9999-0-3-4-20a459a426.novalocal,UID:ci-9999-0-3-4-20a459a426.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:InvalidDiskCapacity,Message:invalid capacity 0 on image filesystem,Source:EventSource{Component:kubelet,Host:ci-9999-0-3-4-20a459a426.novalocal,},FirstTimestamp:2025-03-21 13:31:49.202773676 +0000 UTC m=+1.495226132,LastTimestamp:2025-03-21 13:31:49.202773676 +0000 UTC m=+1.495226132,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-9999-0-3-4-20a459a426.novalocal,}" Mar 21 13:31:52.771068 kubelet[2380]: E0321 13:31:52.771026 2380 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-9999-0-3-4-20a459a426.novalocal\" not found" Mar 21 13:31:52.871854 kubelet[2380]: E0321 13:31:52.871510 2380 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-9999-0-3-4-20a459a426.novalocal\" not found" Mar 21 13:31:52.972498 kubelet[2380]: E0321 13:31:52.972463 2380 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-9999-0-3-4-20a459a426.novalocal\" not found" Mar 21 13:31:53.073410 kubelet[2380]: E0321 13:31:53.073372 2380 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-9999-0-3-4-20a459a426.novalocal\" not found" Mar 21 13:31:53.174061 kubelet[2380]: E0321 13:31:53.173868 2380 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-9999-0-3-4-20a459a426.novalocal\" not found" Mar 21 13:31:53.274044 kubelet[2380]: E0321 13:31:53.273978 2380 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-9999-0-3-4-20a459a426.novalocal\" not found" Mar 21 13:31:53.374838 kubelet[2380]: E0321 13:31:53.374718 2380 kubelet_node_status.go:462] "Error getting the current node from lister" err="node 
\"ci-9999-0-3-4-20a459a426.novalocal\" not found" Mar 21 13:31:53.475928 kubelet[2380]: E0321 13:31:53.475730 2380 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-9999-0-3-4-20a459a426.novalocal\" not found" Mar 21 13:31:54.153057 kubelet[2380]: I0321 13:31:54.152970 2380 apiserver.go:52] "Watching apiserver" Mar 21 13:31:54.215937 kubelet[2380]: I0321 13:31:54.215779 2380 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 21 13:31:55.411956 systemd[1]: Reload requested from client PID 2671 ('systemctl') (unit session-9.scope)... Mar 21 13:31:55.411993 systemd[1]: Reloading... Mar 21 13:31:55.526507 zram_generator::config[2713]: No configuration found. Mar 21 13:31:55.692204 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 21 13:31:55.833953 systemd[1]: Reloading finished in 421 ms. Mar 21 13:31:55.857327 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 13:31:55.868313 systemd[1]: kubelet.service: Deactivated successfully. Mar 21 13:31:55.868677 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 13:31:55.868732 systemd[1]: kubelet.service: Consumed 1.136s CPU time, 118.1M memory peak. Mar 21 13:31:55.870322 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 21 13:31:56.060613 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 21 13:31:56.068647 (kubelet)[2781]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 21 13:31:56.128399 kubelet[2781]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 13:31:56.128399 kubelet[2781]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 21 13:31:56.128399 kubelet[2781]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 21 13:31:56.129836 kubelet[2781]: I0321 13:31:56.128965 2781 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 21 13:31:56.136305 kubelet[2781]: I0321 13:31:56.136251 2781 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 21 13:31:56.136914 kubelet[2781]: I0321 13:31:56.136543 2781 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 21 13:31:56.137073 kubelet[2781]: I0321 13:31:56.137053 2781 server.go:927] "Client rotation is on, will bootstrap in background" Mar 21 13:31:56.140810 kubelet[2781]: I0321 13:31:56.140783 2781 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 21 13:31:56.143316 kubelet[2781]: I0321 13:31:56.142866 2781 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 21 13:31:56.152510 kubelet[2781]: I0321 13:31:56.151791 2781 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 21 13:31:56.152510 kubelet[2781]: I0321 13:31:56.151968 2781 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 21 13:31:56.152510 kubelet[2781]: I0321 13:31:56.152004 2781 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-9999-0-3-4-20a459a426.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 21 13:31:56.152510 kubelet[2781]: I0321 13:31:56.152294 2781 topology_manager.go:138] "Creating topology manager with none 
policy" Mar 21 13:31:56.152890 kubelet[2781]: I0321 13:31:56.152304 2781 container_manager_linux.go:301] "Creating device plugin manager" Mar 21 13:31:56.152890 kubelet[2781]: I0321 13:31:56.152342 2781 state_mem.go:36] "Initialized new in-memory state store" Mar 21 13:31:56.152890 kubelet[2781]: I0321 13:31:56.152420 2781 kubelet.go:400] "Attempting to sync node with API server" Mar 21 13:31:56.153009 kubelet[2781]: I0321 13:31:56.152998 2781 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 21 13:31:56.153133 kubelet[2781]: I0321 13:31:56.153122 2781 kubelet.go:312] "Adding apiserver pod source" Mar 21 13:31:56.153231 kubelet[2781]: I0321 13:31:56.153221 2781 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 21 13:31:56.156164 kubelet[2781]: I0321 13:31:56.156147 2781 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 21 13:31:56.156428 kubelet[2781]: I0321 13:31:56.156415 2781 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 21 13:31:56.156988 kubelet[2781]: I0321 13:31:56.156974 2781 server.go:1264] "Started kubelet" Mar 21 13:31:56.161792 kubelet[2781]: I0321 13:31:56.160103 2781 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 21 13:31:56.171358 kubelet[2781]: I0321 13:31:56.171329 2781 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 21 13:31:56.172349 kubelet[2781]: I0321 13:31:56.172334 2781 server.go:455] "Adding debug handlers to kubelet server" Mar 21 13:31:56.174311 kubelet[2781]: I0321 13:31:56.174271 2781 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 21 13:31:56.174635 kubelet[2781]: I0321 13:31:56.174622 2781 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 21 13:31:56.176961 kubelet[2781]: I0321 
13:31:56.176946 2781 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 21 13:31:56.180867 kubelet[2781]: I0321 13:31:56.180849 2781 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 21 13:31:56.181162 kubelet[2781]: I0321 13:31:56.181150 2781 reconciler.go:26] "Reconciler: start to sync state" Mar 21 13:31:56.183195 kubelet[2781]: I0321 13:31:56.182956 2781 factory.go:221] Registration of the systemd container factory successfully Mar 21 13:31:56.183195 kubelet[2781]: I0321 13:31:56.183053 2781 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 21 13:31:56.186761 kubelet[2781]: I0321 13:31:56.184980 2781 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 21 13:31:56.186761 kubelet[2781]: E0321 13:31:56.186279 2781 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 21 13:31:56.187817 kubelet[2781]: I0321 13:31:56.187784 2781 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 21 13:31:56.187817 kubelet[2781]: I0321 13:31:56.187811 2781 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 21 13:31:56.187891 kubelet[2781]: I0321 13:31:56.187827 2781 kubelet.go:2337] "Starting kubelet main sync loop" Mar 21 13:31:56.187891 kubelet[2781]: E0321 13:31:56.187861 2781 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 21 13:31:56.191106 kubelet[2781]: I0321 13:31:56.190992 2781 factory.go:221] Registration of the containerd container factory successfully Mar 21 13:31:56.254192 kubelet[2781]: I0321 13:31:56.253708 2781 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 21 13:31:56.254192 kubelet[2781]: I0321 13:31:56.253728 2781 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 21 13:31:56.254192 kubelet[2781]: I0321 13:31:56.253750 2781 state_mem.go:36] "Initialized new in-memory state store" Mar 21 13:31:56.254192 kubelet[2781]: I0321 13:31:56.253919 2781 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 21 13:31:56.254192 kubelet[2781]: I0321 13:31:56.253931 2781 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 21 13:31:56.254192 kubelet[2781]: I0321 13:31:56.253949 2781 policy_none.go:49] "None policy: Start" Mar 21 13:31:56.256029 kubelet[2781]: I0321 13:31:56.254776 2781 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 21 13:31:56.256029 kubelet[2781]: I0321 13:31:56.254803 2781 state_mem.go:35] "Initializing new in-memory state store" Mar 21 13:31:56.256029 kubelet[2781]: I0321 13:31:56.254993 2781 state_mem.go:75] "Updated machine memory state" Mar 21 13:31:56.262793 kubelet[2781]: I0321 13:31:56.262771 2781 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 21 13:31:56.263317 kubelet[2781]: I0321 13:31:56.263281 2781 container_log_manager.go:186] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 21 13:31:56.263877 kubelet[2781]: I0321 13:31:56.263865 2781 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 21 13:31:56.287257 kubelet[2781]: I0321 13:31:56.287234 2781 kubelet_node_status.go:73] "Attempting to register node" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.289347 kubelet[2781]: I0321 13:31:56.289289 2781 topology_manager.go:215] "Topology Admit Handler" podUID="f66d2b5be1f30e6601075988b6283499" podNamespace="kube-system" podName="kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.289570 kubelet[2781]: I0321 13:31:56.289555 2781 topology_manager.go:215] "Topology Admit Handler" podUID="32dfb10191a1a78e54c1e3a24a89d4c9" podNamespace="kube-system" podName="kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.289737 kubelet[2781]: I0321 13:31:56.289714 2781 topology_manager.go:215] "Topology Admit Handler" podUID="5dffe885ae7e967c8aa6ad6ae4490f4e" podNamespace="kube-system" podName="kube-scheduler-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.299403 kubelet[2781]: I0321 13:31:56.299360 2781 kubelet_node_status.go:112] "Node was previously registered" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.299538 kubelet[2781]: I0321 13:31:56.299471 2781 kubelet_node_status.go:76] "Successfully registered node" node="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.301714 kubelet[2781]: W0321 13:31:56.301597 2781 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 21 13:31:56.303856 kubelet[2781]: W0321 13:31:56.303795 2781 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 21 13:31:56.305964 kubelet[2781]: W0321 13:31:56.305948 2781 warnings.go:70] metadata.name: this is used in the 
Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 21 13:31:56.483684 kubelet[2781]: I0321 13:31:56.483538 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f66d2b5be1f30e6601075988b6283499-k8s-certs\") pod \"kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"f66d2b5be1f30e6601075988b6283499\") " pod="kube-system/kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.483684 kubelet[2781]: I0321 13:31:56.483624 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/32dfb10191a1a78e54c1e3a24a89d4c9-k8s-certs\") pod \"kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"32dfb10191a1a78e54c1e3a24a89d4c9\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.483684 kubelet[2781]: I0321 13:31:56.483678 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/32dfb10191a1a78e54c1e3a24a89d4c9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"32dfb10191a1a78e54c1e3a24a89d4c9\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.484622 kubelet[2781]: I0321 13:31:56.483727 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5dffe885ae7e967c8aa6ad6ae4490f4e-kubeconfig\") pod \"kube-scheduler-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"5dffe885ae7e967c8aa6ad6ae4490f4e\") " pod="kube-system/kube-scheduler-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.484622 kubelet[2781]: I0321 13:31:56.483776 2781 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f66d2b5be1f30e6601075988b6283499-usr-share-ca-certificates\") pod \"kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"f66d2b5be1f30e6601075988b6283499\") " pod="kube-system/kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.484622 kubelet[2781]: I0321 13:31:56.483820 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/32dfb10191a1a78e54c1e3a24a89d4c9-ca-certs\") pod \"kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"32dfb10191a1a78e54c1e3a24a89d4c9\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.484622 kubelet[2781]: I0321 13:31:56.483863 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/32dfb10191a1a78e54c1e3a24a89d4c9-flexvolume-dir\") pod \"kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"32dfb10191a1a78e54c1e3a24a89d4c9\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.484893 kubelet[2781]: I0321 13:31:56.483909 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/32dfb10191a1a78e54c1e3a24a89d4c9-kubeconfig\") pod \"kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"32dfb10191a1a78e54c1e3a24a89d4c9\") " pod="kube-system/kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:56.484893 kubelet[2781]: I0321 13:31:56.483953 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f66d2b5be1f30e6601075988b6283499-ca-certs\") pod 
\"kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal\" (UID: \"f66d2b5be1f30e6601075988b6283499\") " pod="kube-system/kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:57.154193 kubelet[2781]: I0321 13:31:57.154152 2781 apiserver.go:52] "Watching apiserver" Mar 21 13:31:57.181316 kubelet[2781]: I0321 13:31:57.181268 2781 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 21 13:31:57.240644 kubelet[2781]: W0321 13:31:57.239709 2781 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 21 13:31:57.240644 kubelet[2781]: E0321 13:31:57.239766 2781 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:31:57.259157 kubelet[2781]: I0321 13:31:57.259093 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-9999-0-3-4-20a459a426.novalocal" podStartSLOduration=1.258962034 podStartE2EDuration="1.258962034s" podCreationTimestamp="2025-03-21 13:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 13:31:57.258866019 +0000 UTC m=+1.185084400" watchObservedRunningTime="2025-03-21 13:31:57.258962034 +0000 UTC m=+1.185180415" Mar 21 13:31:57.287893 kubelet[2781]: I0321 13:31:57.287777 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-9999-0-3-4-20a459a426.novalocal" podStartSLOduration=1.287761093 podStartE2EDuration="1.287761093s" podCreationTimestamp="2025-03-21 13:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 13:31:57.275995147 +0000 UTC 
m=+1.202213528" watchObservedRunningTime="2025-03-21 13:31:57.287761093 +0000 UTC m=+1.213979484" Mar 21 13:31:57.301021 kubelet[2781]: I0321 13:31:57.300976 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-9999-0-3-4-20a459a426.novalocal" podStartSLOduration=1.300947067 podStartE2EDuration="1.300947067s" podCreationTimestamp="2025-03-21 13:31:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 13:31:57.28819124 +0000 UTC m=+1.214409631" watchObservedRunningTime="2025-03-21 13:31:57.300947067 +0000 UTC m=+1.227165448" Mar 21 13:32:02.339778 sudo[1722]: pam_unix(sudo:session): session closed for user root Mar 21 13:32:02.529948 sshd[1721]: Connection closed by 172.24.4.1 port 47878 Mar 21 13:32:02.531023 sshd-session[1718]: pam_unix(sshd:session): session closed for user core Mar 21 13:32:02.536121 systemd[1]: sshd@6-172.24.4.107:22-172.24.4.1:47878.service: Deactivated successfully. Mar 21 13:32:02.540276 systemd[1]: session-9.scope: Deactivated successfully. Mar 21 13:32:02.541190 systemd[1]: session-9.scope: Consumed 7.350s CPU time, 246.9M memory peak. Mar 21 13:32:02.544481 systemd-logind[1460]: Session 9 logged out. Waiting for processes to exit. Mar 21 13:32:02.545955 systemd-logind[1460]: Removed session 9. Mar 21 13:32:09.424132 kubelet[2781]: I0321 13:32:09.423961 2781 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 21 13:32:09.425714 containerd[1480]: time="2025-03-21T13:32:09.425499435Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Mar 21 13:32:09.426322 kubelet[2781]: I0321 13:32:09.425910 2781 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 21 13:32:10.163593 kubelet[2781]: I0321 13:32:10.163519 2781 topology_manager.go:215] "Topology Admit Handler" podUID="738b48f0-72a7-473f-b4b2-a021f4bffadd" podNamespace="kube-system" podName="kube-proxy-ls7zx" Mar 21 13:32:10.172274 kubelet[2781]: I0321 13:32:10.171923 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/738b48f0-72a7-473f-b4b2-a021f4bffadd-kube-proxy\") pod \"kube-proxy-ls7zx\" (UID: \"738b48f0-72a7-473f-b4b2-a021f4bffadd\") " pod="kube-system/kube-proxy-ls7zx" Mar 21 13:32:10.172274 kubelet[2781]: I0321 13:32:10.172038 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/738b48f0-72a7-473f-b4b2-a021f4bffadd-xtables-lock\") pod \"kube-proxy-ls7zx\" (UID: \"738b48f0-72a7-473f-b4b2-a021f4bffadd\") " pod="kube-system/kube-proxy-ls7zx" Mar 21 13:32:10.172274 kubelet[2781]: I0321 13:32:10.172093 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/738b48f0-72a7-473f-b4b2-a021f4bffadd-lib-modules\") pod \"kube-proxy-ls7zx\" (UID: \"738b48f0-72a7-473f-b4b2-a021f4bffadd\") " pod="kube-system/kube-proxy-ls7zx" Mar 21 13:32:10.172274 kubelet[2781]: I0321 13:32:10.172143 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsc28\" (UniqueName: \"kubernetes.io/projected/738b48f0-72a7-473f-b4b2-a021f4bffadd-kube-api-access-hsc28\") pod \"kube-proxy-ls7zx\" (UID: \"738b48f0-72a7-473f-b4b2-a021f4bffadd\") " pod="kube-system/kube-proxy-ls7zx" Mar 21 13:32:10.186061 systemd[1]: Created slice 
kubepods-besteffort-pod738b48f0_72a7_473f_b4b2_a021f4bffadd.slice - libcontainer container kubepods-besteffort-pod738b48f0_72a7_473f_b4b2_a021f4bffadd.slice. Mar 21 13:32:10.343200 kubelet[2781]: I0321 13:32:10.341746 2781 topology_manager.go:215] "Topology Admit Handler" podUID="3ce084ca-524f-4b51-bcf9-f1e9f08aa838" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-426mj" Mar 21 13:32:10.356195 systemd[1]: Created slice kubepods-besteffort-pod3ce084ca_524f_4b51_bcf9_f1e9f08aa838.slice - libcontainer container kubepods-besteffort-pod3ce084ca_524f_4b51_bcf9_f1e9f08aa838.slice. Mar 21 13:32:10.372681 kubelet[2781]: I0321 13:32:10.372587 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3ce084ca-524f-4b51-bcf9-f1e9f08aa838-var-lib-calico\") pod \"tigera-operator-6479d6dc54-426mj\" (UID: \"3ce084ca-524f-4b51-bcf9-f1e9f08aa838\") " pod="tigera-operator/tigera-operator-6479d6dc54-426mj" Mar 21 13:32:10.372681 kubelet[2781]: I0321 13:32:10.372635 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzjp7\" (UniqueName: \"kubernetes.io/projected/3ce084ca-524f-4b51-bcf9-f1e9f08aa838-kube-api-access-xzjp7\") pod \"tigera-operator-6479d6dc54-426mj\" (UID: \"3ce084ca-524f-4b51-bcf9-f1e9f08aa838\") " pod="tigera-operator/tigera-operator-6479d6dc54-426mj" Mar 21 13:32:10.499685 containerd[1480]: time="2025-03-21T13:32:10.499244367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ls7zx,Uid:738b48f0-72a7-473f-b4b2-a021f4bffadd,Namespace:kube-system,Attempt:0,}" Mar 21 13:32:10.549581 containerd[1480]: time="2025-03-21T13:32:10.549353651Z" level=info msg="connecting to shim 231c624399113ae18968ebe36de4fe4a590af5fbea68bf60d6eb8a1cb2c900aa" address="unix:///run/containerd/s/5d4243cb2c235fb950fbb4e29c2fcb00823c2d62db48709c236c13e869b7b515" namespace=k8s.io protocol=ttrpc 
version=3 Mar 21 13:32:10.596624 systemd[1]: Started cri-containerd-231c624399113ae18968ebe36de4fe4a590af5fbea68bf60d6eb8a1cb2c900aa.scope - libcontainer container 231c624399113ae18968ebe36de4fe4a590af5fbea68bf60d6eb8a1cb2c900aa. Mar 21 13:32:10.631208 containerd[1480]: time="2025-03-21T13:32:10.631171997Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-ls7zx,Uid:738b48f0-72a7-473f-b4b2-a021f4bffadd,Namespace:kube-system,Attempt:0,} returns sandbox id \"231c624399113ae18968ebe36de4fe4a590af5fbea68bf60d6eb8a1cb2c900aa\"" Mar 21 13:32:10.635929 containerd[1480]: time="2025-03-21T13:32:10.635270602Z" level=info msg="CreateContainer within sandbox \"231c624399113ae18968ebe36de4fe4a590af5fbea68bf60d6eb8a1cb2c900aa\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 21 13:32:10.653919 containerd[1480]: time="2025-03-21T13:32:10.653883791Z" level=info msg="Container 29fd3fb4e4214769d3e9b47236da68041c2cb95f2f4109019627b9f1ee06a867: CDI devices from CRI Config.CDIDevices: []" Mar 21 13:32:10.661343 containerd[1480]: time="2025-03-21T13:32:10.661286517Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-426mj,Uid:3ce084ca-524f-4b51-bcf9-f1e9f08aa838,Namespace:tigera-operator,Attempt:0,}" Mar 21 13:32:10.668401 containerd[1480]: time="2025-03-21T13:32:10.668371414Z" level=info msg="CreateContainer within sandbox \"231c624399113ae18968ebe36de4fe4a590af5fbea68bf60d6eb8a1cb2c900aa\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"29fd3fb4e4214769d3e9b47236da68041c2cb95f2f4109019627b9f1ee06a867\"" Mar 21 13:32:10.668980 containerd[1480]: time="2025-03-21T13:32:10.668870350Z" level=info msg="StartContainer for \"29fd3fb4e4214769d3e9b47236da68041c2cb95f2f4109019627b9f1ee06a867\"" Mar 21 13:32:10.670828 containerd[1480]: time="2025-03-21T13:32:10.670801164Z" level=info msg="connecting to shim 29fd3fb4e4214769d3e9b47236da68041c2cb95f2f4109019627b9f1ee06a867" 
address="unix:///run/containerd/s/5d4243cb2c235fb950fbb4e29c2fcb00823c2d62db48709c236c13e869b7b515" protocol=ttrpc version=3 Mar 21 13:32:10.694996 systemd[1]: Started cri-containerd-29fd3fb4e4214769d3e9b47236da68041c2cb95f2f4109019627b9f1ee06a867.scope - libcontainer container 29fd3fb4e4214769d3e9b47236da68041c2cb95f2f4109019627b9f1ee06a867. Mar 21 13:32:10.696534 containerd[1480]: time="2025-03-21T13:32:10.696501283Z" level=info msg="connecting to shim 0b4101624f5acda3e039e0773aeb97b8ba80c9ef707bdbf2389575ad15897636" address="unix:///run/containerd/s/f6ddbd3951f64768f36dbca07b2c208c914d77e9303e6b20d9d841c87a13adfb" namespace=k8s.io protocol=ttrpc version=3 Mar 21 13:32:10.728592 systemd[1]: Started cri-containerd-0b4101624f5acda3e039e0773aeb97b8ba80c9ef707bdbf2389575ad15897636.scope - libcontainer container 0b4101624f5acda3e039e0773aeb97b8ba80c9ef707bdbf2389575ad15897636. Mar 21 13:32:10.770005 containerd[1480]: time="2025-03-21T13:32:10.769414063Z" level=info msg="StartContainer for \"29fd3fb4e4214769d3e9b47236da68041c2cb95f2f4109019627b9f1ee06a867\" returns successfully" Mar 21 13:32:10.800070 containerd[1480]: time="2025-03-21T13:32:10.800017972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-426mj,Uid:3ce084ca-524f-4b51-bcf9-f1e9f08aa838,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0b4101624f5acda3e039e0773aeb97b8ba80c9ef707bdbf2389575ad15897636\"" Mar 21 13:32:10.802770 containerd[1480]: time="2025-03-21T13:32:10.802669613Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 21 13:32:11.301353 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1209639011.mount: Deactivated successfully. Mar 21 13:32:12.853779 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1625885010.mount: Deactivated successfully. 
Mar 21 13:32:15.156876 containerd[1480]: time="2025-03-21T13:32:15.156770187Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:32:15.158153 containerd[1480]: time="2025-03-21T13:32:15.157953049Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008" Mar 21 13:32:15.159461 containerd[1480]: time="2025-03-21T13:32:15.159419388Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:32:15.163272 containerd[1480]: time="2025-03-21T13:32:15.162493749Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:32:15.163432 containerd[1480]: time="2025-03-21T13:32:15.163392261Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 4.360626792s" Mar 21 13:32:15.163553 containerd[1480]: time="2025-03-21T13:32:15.163535468Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\"" Mar 21 13:32:15.167206 containerd[1480]: time="2025-03-21T13:32:15.167166917Z" level=info msg="CreateContainer within sandbox \"0b4101624f5acda3e039e0773aeb97b8ba80c9ef707bdbf2389575ad15897636\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 21 13:32:15.180957 containerd[1480]: time="2025-03-21T13:32:15.180916947Z" level=info msg="Container 
d542ad8f2648760316e4977a02dfdd8fee452bd639e0c4a35733a24d50a15357: CDI devices from CRI Config.CDIDevices: []" Mar 21 13:32:15.187251 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4141573469.mount: Deactivated successfully. Mar 21 13:32:15.191690 containerd[1480]: time="2025-03-21T13:32:15.191646997Z" level=info msg="CreateContainer within sandbox \"0b4101624f5acda3e039e0773aeb97b8ba80c9ef707bdbf2389575ad15897636\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d542ad8f2648760316e4977a02dfdd8fee452bd639e0c4a35733a24d50a15357\"" Mar 21 13:32:15.194925 containerd[1480]: time="2025-03-21T13:32:15.193927811Z" level=info msg="StartContainer for \"d542ad8f2648760316e4977a02dfdd8fee452bd639e0c4a35733a24d50a15357\"" Mar 21 13:32:15.194925 containerd[1480]: time="2025-03-21T13:32:15.194857762Z" level=info msg="connecting to shim d542ad8f2648760316e4977a02dfdd8fee452bd639e0c4a35733a24d50a15357" address="unix:///run/containerd/s/f6ddbd3951f64768f36dbca07b2c208c914d77e9303e6b20d9d841c87a13adfb" protocol=ttrpc version=3 Mar 21 13:32:15.220643 systemd[1]: Started cri-containerd-d542ad8f2648760316e4977a02dfdd8fee452bd639e0c4a35733a24d50a15357.scope - libcontainer container d542ad8f2648760316e4977a02dfdd8fee452bd639e0c4a35733a24d50a15357. 
Mar 21 13:32:15.255864 containerd[1480]: time="2025-03-21T13:32:15.255818467Z" level=info msg="StartContainer for \"d542ad8f2648760316e4977a02dfdd8fee452bd639e0c4a35733a24d50a15357\" returns successfully" Mar 21 13:32:15.298471 kubelet[2781]: I0321 13:32:15.298018 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-ls7zx" podStartSLOduration=5.297995162 podStartE2EDuration="5.297995162s" podCreationTimestamp="2025-03-21 13:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 13:32:11.291497187 +0000 UTC m=+15.217715598" watchObservedRunningTime="2025-03-21 13:32:15.297995162 +0000 UTC m=+19.224213544" Mar 21 13:32:18.820921 kubelet[2781]: I0321 13:32:18.820781 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-426mj" podStartSLOduration=4.458311509 podStartE2EDuration="8.820745399s" podCreationTimestamp="2025-03-21 13:32:10 +0000 UTC" firstStartedPulling="2025-03-21 13:32:10.801989942 +0000 UTC m=+14.728208323" lastFinishedPulling="2025-03-21 13:32:15.164423832 +0000 UTC m=+19.090642213" observedRunningTime="2025-03-21 13:32:15.300688264 +0000 UTC m=+19.226906645" watchObservedRunningTime="2025-03-21 13:32:18.820745399 +0000 UTC m=+22.746963830" Mar 21 13:32:18.822816 kubelet[2781]: I0321 13:32:18.821037 2781 topology_manager.go:215] "Topology Admit Handler" podUID="e2576259-a3da-4790-bcdc-f3c858add0a5" podNamespace="calico-system" podName="calico-typha-7c5f65576d-bhkrq" Mar 21 13:32:18.831409 kubelet[2781]: I0321 13:32:18.831294 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2576259-a3da-4790-bcdc-f3c858add0a5-tigera-ca-bundle\") pod \"calico-typha-7c5f65576d-bhkrq\" (UID: \"e2576259-a3da-4790-bcdc-f3c858add0a5\") " 
pod="calico-system/calico-typha-7c5f65576d-bhkrq" Mar 21 13:32:18.831409 kubelet[2781]: I0321 13:32:18.831378 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e2576259-a3da-4790-bcdc-f3c858add0a5-typha-certs\") pod \"calico-typha-7c5f65576d-bhkrq\" (UID: \"e2576259-a3da-4790-bcdc-f3c858add0a5\") " pod="calico-system/calico-typha-7c5f65576d-bhkrq" Mar 21 13:32:18.832179 kubelet[2781]: I0321 13:32:18.831491 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdwvv\" (UniqueName: \"kubernetes.io/projected/e2576259-a3da-4790-bcdc-f3c858add0a5-kube-api-access-fdwvv\") pod \"calico-typha-7c5f65576d-bhkrq\" (UID: \"e2576259-a3da-4790-bcdc-f3c858add0a5\") " pod="calico-system/calico-typha-7c5f65576d-bhkrq" Mar 21 13:32:18.838958 systemd[1]: Created slice kubepods-besteffort-pode2576259_a3da_4790_bcdc_f3c858add0a5.slice - libcontainer container kubepods-besteffort-pode2576259_a3da_4790_bcdc_f3c858add0a5.slice. 
Mar 21 13:32:18.849487 kubelet[2781]: W0321 13:32:18.849394 2781 reflector.go:547] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-9999-0-3-4-20a459a426.novalocal" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-9999-0-3-4-20a459a426.novalocal' and this object Mar 21 13:32:18.849487 kubelet[2781]: E0321 13:32:18.849487 2781 reflector.go:150] object-"calico-system"/"typha-certs": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-9999-0-3-4-20a459a426.novalocal" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-9999-0-3-4-20a459a426.novalocal' and this object Mar 21 13:32:18.849767 kubelet[2781]: W0321 13:32:18.849541 2781 reflector.go:547] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-9999-0-3-4-20a459a426.novalocal" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-9999-0-3-4-20a459a426.novalocal' and this object Mar 21 13:32:18.849767 kubelet[2781]: E0321 13:32:18.849555 2781 reflector.go:150] object-"calico-system"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-9999-0-3-4-20a459a426.novalocal" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-9999-0-3-4-20a459a426.novalocal' and this object Mar 21 13:32:18.849767 kubelet[2781]: W0321 13:32:18.849413 2781 reflector.go:547] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:ci-9999-0-3-4-20a459a426.novalocal" cannot list resource 
"configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-9999-0-3-4-20a459a426.novalocal' and this object Mar 21 13:32:18.849767 kubelet[2781]: E0321 13:32:18.849599 2781 reflector.go:150] object-"calico-system"/"tigera-ca-bundle": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:ci-9999-0-3-4-20a459a426.novalocal" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-9999-0-3-4-20a459a426.novalocal' and this object Mar 21 13:32:19.358985 kubelet[2781]: I0321 13:32:19.358379 2781 topology_manager.go:215] "Topology Admit Handler" podUID="0d71416e-2d02-4a15-9098-811bf718db77" podNamespace="calico-system" podName="calico-node-bcwqc" Mar 21 13:32:19.384728 systemd[1]: Created slice kubepods-besteffort-pod0d71416e_2d02_4a15_9098_811bf718db77.slice - libcontainer container kubepods-besteffort-pod0d71416e_2d02_4a15_9098_811bf718db77.slice. 
Mar 21 13:32:19.437980 kubelet[2781]: I0321 13:32:19.437566 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/0d71416e-2d02-4a15-9098-811bf718db77-flexvol-driver-host\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.437980 kubelet[2781]: I0321 13:32:19.437610 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpkr6\" (UniqueName: \"kubernetes.io/projected/0d71416e-2d02-4a15-9098-811bf718db77-kube-api-access-vpkr6\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.437980 kubelet[2781]: I0321 13:32:19.437645 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/0d71416e-2d02-4a15-9098-811bf718db77-cni-log-dir\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.437980 kubelet[2781]: I0321 13:32:19.437676 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d71416e-2d02-4a15-9098-811bf718db77-tigera-ca-bundle\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.437980 kubelet[2781]: I0321 13:32:19.437698 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/0d71416e-2d02-4a15-9098-811bf718db77-cni-bin-dir\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.438308 kubelet[2781]: I0321 
13:32:19.437724 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0d71416e-2d02-4a15-9098-811bf718db77-xtables-lock\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.438308 kubelet[2781]: I0321 13:32:19.437743 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/0d71416e-2d02-4a15-9098-811bf718db77-node-certs\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.438308 kubelet[2781]: I0321 13:32:19.437763 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/0d71416e-2d02-4a15-9098-811bf718db77-cni-net-dir\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.438308 kubelet[2781]: I0321 13:32:19.437782 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d71416e-2d02-4a15-9098-811bf718db77-lib-modules\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.438308 kubelet[2781]: I0321 13:32:19.437800 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/0d71416e-2d02-4a15-9098-811bf718db77-policysync\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.438477 kubelet[2781]: I0321 13:32:19.437819 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/0d71416e-2d02-4a15-9098-811bf718db77-var-run-calico\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.438477 kubelet[2781]: I0321 13:32:19.437838 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0d71416e-2d02-4a15-9098-811bf718db77-var-lib-calico\") pod \"calico-node-bcwqc\" (UID: \"0d71416e-2d02-4a15-9098-811bf718db77\") " pod="calico-system/calico-node-bcwqc" Mar 21 13:32:19.484478 kubelet[2781]: I0321 13:32:19.484103 2781 topology_manager.go:215] "Topology Admit Handler" podUID="853e0129-1988-42f8-a91f-ebe3cb4d5800" podNamespace="calico-system" podName="csi-node-driver-hh6d2" Mar 21 13:32:19.484478 kubelet[2781]: E0321 13:32:19.484362 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hh6d2" podUID="853e0129-1988-42f8-a91f-ebe3cb4d5800" Mar 21 13:32:19.539793 kubelet[2781]: I0321 13:32:19.538773 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/853e0129-1988-42f8-a91f-ebe3cb4d5800-registration-dir\") pod \"csi-node-driver-hh6d2\" (UID: \"853e0129-1988-42f8-a91f-ebe3cb4d5800\") " pod="calico-system/csi-node-driver-hh6d2" Mar 21 13:32:19.539793 kubelet[2781]: I0321 13:32:19.538823 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/853e0129-1988-42f8-a91f-ebe3cb4d5800-kubelet-dir\") pod \"csi-node-driver-hh6d2\" (UID: \"853e0129-1988-42f8-a91f-ebe3cb4d5800\") " 
pod="calico-system/csi-node-driver-hh6d2" Mar 21 13:32:19.539793 kubelet[2781]: I0321 13:32:19.538881 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/853e0129-1988-42f8-a91f-ebe3cb4d5800-varrun\") pod \"csi-node-driver-hh6d2\" (UID: \"853e0129-1988-42f8-a91f-ebe3cb4d5800\") " pod="calico-system/csi-node-driver-hh6d2" Mar 21 13:32:19.539793 kubelet[2781]: I0321 13:32:19.538901 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2fc2\" (UniqueName: \"kubernetes.io/projected/853e0129-1988-42f8-a91f-ebe3cb4d5800-kube-api-access-j2fc2\") pod \"csi-node-driver-hh6d2\" (UID: \"853e0129-1988-42f8-a91f-ebe3cb4d5800\") " pod="calico-system/csi-node-driver-hh6d2" Mar 21 13:32:19.539793 kubelet[2781]: I0321 13:32:19.539014 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/853e0129-1988-42f8-a91f-ebe3cb4d5800-socket-dir\") pod \"csi-node-driver-hh6d2\" (UID: \"853e0129-1988-42f8-a91f-ebe3cb4d5800\") " pod="calico-system/csi-node-driver-hh6d2" Mar 21 13:32:19.548775 kubelet[2781]: E0321 13:32:19.548642 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:19.549514 kubelet[2781]: W0321 13:32:19.549496 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:19.549614 kubelet[2781]: E0321 13:32:19.549600 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 13:32:19.640168 kubelet[2781]: E0321 13:32:19.640067 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:19.640168 kubelet[2781]: W0321 13:32:19.640093 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:19.640168 kubelet[2781]: E0321 13:32:19.640115 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 13:32:19.640937 kubelet[2781]: E0321 13:32:19.640917 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:19.640937 kubelet[2781]: W0321 13:32:19.640932 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:19.641243 kubelet[2781]: E0321 13:32:19.640958 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 13:32:19.641625 kubelet[2781]: E0321 13:32:19.641572 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:19.641625 kubelet[2781]: W0321 13:32:19.641587 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:19.641625 kubelet[2781]: E0321 13:32:19.641603 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 13:32:19.641927 kubelet[2781]: E0321 13:32:19.641866 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:19.641927 kubelet[2781]: W0321 13:32:19.641875 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:19.641927 kubelet[2781]: E0321 13:32:19.641884 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 13:32:19.642295 kubelet[2781]: E0321 13:32:19.642099 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:19.642295 kubelet[2781]: W0321 13:32:19.642114 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:19.642295 kubelet[2781]: E0321 13:32:19.642158 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 13:32:19.642295 kubelet[2781]: E0321 13:32:19.642288 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:19.642295 kubelet[2781]: W0321 13:32:19.642299 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:19.642732 kubelet[2781]: E0321 13:32:19.642316 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 13:32:19.642732 kubelet[2781]: E0321 13:32:19.642520 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:19.642732 kubelet[2781]: W0321 13:32:19.642541 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:19.642732 kubelet[2781]: E0321 13:32:19.642558 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 13:32:19.642907 kubelet[2781]: E0321 13:32:19.642889 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:19.642907 kubelet[2781]: W0321 13:32:19.642904 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:19.642995 kubelet[2781]: E0321 13:32:19.642914 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 13:32:19.643277 kubelet[2781]: E0321 13:32:19.643258 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:19.643277 kubelet[2781]: W0321 13:32:19.643273 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:19.643450 kubelet[2781]: E0321 13:32:19.643289 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 13:32:19.643628 kubelet[2781]: E0321 13:32:19.643605 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:19.643628 kubelet[2781]: W0321 13:32:19.643621 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:19.643879 kubelet[2781]: E0321 13:32:19.643637 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 21 13:32:19.851262 kubelet[2781]: E0321 13:32:19.851090 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 21 13:32:19.851262 kubelet[2781]: W0321 13:32:19.851114 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 21 13:32:19.851262 kubelet[2781]: E0321 13:32:19.851123 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 21 13:32:19.933382 kubelet[2781]: E0321 13:32:19.933182 2781 configmap.go:199] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 21 13:32:19.933382 kubelet[2781]: E0321 13:32:19.933217 2781 secret.go:194] Couldn't get secret calico-system/typha-certs: failed to sync secret cache: timed out waiting for the condition
Mar 21 13:32:19.933382 kubelet[2781]: E0321 13:32:19.933335 2781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e2576259-a3da-4790-bcdc-f3c858add0a5-tigera-ca-bundle podName:e2576259-a3da-4790-bcdc-f3c858add0a5 nodeName:}" failed. No retries permitted until 2025-03-21 13:32:20.433293223 +0000 UTC m=+24.359511664 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/e2576259-a3da-4790-bcdc-f3c858add0a5-tigera-ca-bundle") pod "calico-typha-7c5f65576d-bhkrq" (UID: "e2576259-a3da-4790-bcdc-f3c858add0a5") : failed to sync configmap cache: timed out waiting for the condition
Mar 21 13:32:19.933642 kubelet[2781]: E0321 13:32:19.933418 2781 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2576259-a3da-4790-bcdc-f3c858add0a5-typha-certs podName:e2576259-a3da-4790-bcdc-f3c858add0a5 nodeName:}" failed. No retries permitted until 2025-03-21 13:32:20.433395423 +0000 UTC m=+24.359613854 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "typha-certs" (UniqueName: "kubernetes.io/secret/e2576259-a3da-4790-bcdc-f3c858add0a5-typha-certs") pod "calico-typha-7c5f65576d-bhkrq" (UID: "e2576259-a3da-4790-bcdc-f3c858add0a5") : failed to sync secret cache: timed out waiting for the condition
Mar 21 13:32:19.952557 kubelet[2781]: E0321 13:32:19.952368 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 21 13:32:19.952557 kubelet[2781]: W0321 13:32:19.952392 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 21 13:32:19.952557 kubelet[2781]: E0321 13:32:19.952412 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 21 13:32:19.953038 kubelet[2781]: E0321 13:32:19.952773 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 21 13:32:19.953038 kubelet[2781]: W0321 13:32:19.952783 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 21 13:32:19.953038 kubelet[2781]: E0321 13:32:19.952792 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 21 13:32:19.953220 kubelet[2781]: E0321 13:32:19.953205 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 21 13:32:19.953327 kubelet[2781]: W0321 13:32:19.953284 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 21 13:32:19.953327 kubelet[2781]: E0321 13:32:19.953299 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 21 13:32:20.473563 kubelet[2781]: E0321 13:32:20.473514 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 21 13:32:20.473563 kubelet[2781]: W0321 13:32:20.473526 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 21 13:32:20.473563 kubelet[2781]: E0321 13:32:20.473543 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 13:32:20.473983 kubelet[2781]: E0321 13:32:20.473749 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:20.473983 kubelet[2781]: W0321 13:32:20.473770 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:20.473983 kubelet[2781]: E0321 13:32:20.473797 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 13:32:20.475607 kubelet[2781]: E0321 13:32:20.475511 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:20.475607 kubelet[2781]: W0321 13:32:20.475524 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:20.475607 kubelet[2781]: E0321 13:32:20.475542 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 13:32:20.475960 kubelet[2781]: E0321 13:32:20.475904 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:20.475960 kubelet[2781]: W0321 13:32:20.475914 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:20.475960 kubelet[2781]: E0321 13:32:20.475945 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 13:32:20.478683 kubelet[2781]: E0321 13:32:20.478575 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:20.478683 kubelet[2781]: W0321 13:32:20.478590 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:20.478683 kubelet[2781]: E0321 13:32:20.478602 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 13:32:20.479054 kubelet[2781]: E0321 13:32:20.478905 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:20.479054 kubelet[2781]: W0321 13:32:20.478917 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:20.479054 kubelet[2781]: E0321 13:32:20.478944 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 13:32:20.479233 kubelet[2781]: E0321 13:32:20.479223 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:20.479452 kubelet[2781]: W0321 13:32:20.479284 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:20.479452 kubelet[2781]: E0321 13:32:20.479308 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 13:32:20.479583 kubelet[2781]: E0321 13:32:20.479561 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:20.479583 kubelet[2781]: W0321 13:32:20.479578 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:20.479647 kubelet[2781]: E0321 13:32:20.479592 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 21 13:32:20.489485 kubelet[2781]: E0321 13:32:20.489405 2781 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 21 13:32:20.489694 kubelet[2781]: W0321 13:32:20.489430 2781 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 21 13:32:20.489694 kubelet[2781]: E0321 13:32:20.489654 2781 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 21 13:32:20.595595 containerd[1480]: time="2025-03-21T13:32:20.593674308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bcwqc,Uid:0d71416e-2d02-4a15-9098-811bf718db77,Namespace:calico-system,Attempt:0,}" Mar 21 13:32:20.631924 containerd[1480]: time="2025-03-21T13:32:20.631756594Z" level=info msg="connecting to shim d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b" address="unix:///run/containerd/s/3f9496020f474f6cf514eb34d87107a7c871b58013d637a426f6ab6bd44a3736" namespace=k8s.io protocol=ttrpc version=3 Mar 21 13:32:20.652568 containerd[1480]: time="2025-03-21T13:32:20.652198160Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7c5f65576d-bhkrq,Uid:e2576259-a3da-4790-bcdc-f3c858add0a5,Namespace:calico-system,Attempt:0,}" Mar 21 13:32:20.677132 systemd[1]: Started cri-containerd-d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b.scope - libcontainer container d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b. Mar 21 13:32:20.686467 containerd[1480]: time="2025-03-21T13:32:20.686134885Z" level=info msg="connecting to shim 5f047371f53fe4635c0bc00385aa2d0412afe5a1fce0e8302e7cfefefe0bfe4a" address="unix:///run/containerd/s/93c6ab63830fd068f66424f9fbc5f7a01f2b97d5ba5c8703f5adb780f4da63da" namespace=k8s.io protocol=ttrpc version=3 Mar 21 13:32:20.731640 systemd[1]: Started cri-containerd-5f047371f53fe4635c0bc00385aa2d0412afe5a1fce0e8302e7cfefefe0bfe4a.scope - libcontainer container 5f047371f53fe4635c0bc00385aa2d0412afe5a1fce0e8302e7cfefefe0bfe4a. 
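The repeated errors above come from the kubelet's FlexVolume probing loop: for each directory under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ it invokes `<driver> init` and expects a JSON status object on stdout. Because the `uds` executable is missing, each call produces empty output, and unmarshalling an empty string fails. A minimal Python sketch of that handling (an illustration only, not the kubelet's actual Go code; Go's encoding/json reports this case as "unexpected end of JSON input", Python's message differs):

```python
import json

def parse_driver_output(output: str) -> dict:
    """Mimic how a FlexVolume driver's stdout is interpreted:
    an empty string is not valid JSON, so parsing fails."""
    try:
        return json.loads(output)
    except json.JSONDecodeError as exc:
        # The kubelet logs this condition and skips the plugin directory.
        return {"status": "Failure", "message": str(exc)}

# A working driver would answer `init` with a status object:
ok = parse_driver_output('{"status": "Success", "capabilities": {"attach": false}}')
# A missing executable yields empty output, reproducing the failure above:
bad = parse_driver_output("")
```

Placing any non-driver directory (here `nodeagent~uds` without its binary) under the plugin path makes the prober log this triple of errors on every filesystem event.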
Mar 21 13:32:20.736121 containerd[1480]: time="2025-03-21T13:32:20.736085496Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bcwqc,Uid:0d71416e-2d02-4a15-9098-811bf718db77,Namespace:calico-system,Attempt:0,} returns sandbox id \"d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b\""
Mar 21 13:32:20.738493 containerd[1480]: time="2025-03-21T13:32:20.738467430Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\""
Mar 21 13:32:20.815471 containerd[1480]: time="2025-03-21T13:32:20.815119140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7c5f65576d-bhkrq,Uid:e2576259-a3da-4790-bcdc-f3c858add0a5,Namespace:calico-system,Attempt:0,} returns sandbox id \"5f047371f53fe4635c0bc00385aa2d0412afe5a1fce0e8302e7cfefefe0bfe4a\""
Mar 21 13:32:21.189523 kubelet[2781]: E0321 13:32:21.188936 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hh6d2" podUID="853e0129-1988-42f8-a91f-ebe3cb4d5800"
Mar 21 13:32:22.881970 containerd[1480]: time="2025-03-21T13:32:22.881260802Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:22.884260 containerd[1480]: time="2025-03-21T13:32:22.884202943Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011"
Mar 21 13:32:22.885999 containerd[1480]: time="2025-03-21T13:32:22.885923784Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:22.890303 containerd[1480]: time="2025-03-21T13:32:22.890244668Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:22.893452 containerd[1480]: time="2025-03-21T13:32:22.893273130Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 2.154047547s"
Mar 21 13:32:22.893452 containerd[1480]: time="2025-03-21T13:32:22.893317603Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\""
Mar 21 13:32:22.895212 containerd[1480]: time="2025-03-21T13:32:22.894688241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\""
Mar 21 13:32:22.898522 containerd[1480]: time="2025-03-21T13:32:22.897150998Z" level=info msg="CreateContainer within sandbox \"d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Mar 21 13:32:22.914495 containerd[1480]: time="2025-03-21T13:32:22.913024924Z" level=info msg="Container 96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7: CDI devices from CRI Config.CDIDevices: []"
Mar 21 13:32:22.936643 containerd[1480]: time="2025-03-21T13:32:22.936001105Z" level=info msg="CreateContainer within sandbox \"d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7\""
Mar 21 13:32:22.938170 containerd[1480]: time="2025-03-21T13:32:22.938023189Z" level=info msg="StartContainer for \"96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7\""
Mar 21 13:32:22.943479 containerd[1480]: time="2025-03-21T13:32:22.942510965Z" level=info msg="connecting to shim 96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7" address="unix:///run/containerd/s/3f9496020f474f6cf514eb34d87107a7c871b58013d637a426f6ab6bd44a3736" protocol=ttrpc version=3
Mar 21 13:32:22.982713 systemd[1]: Started cri-containerd-96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7.scope - libcontainer container 96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7.
Mar 21 13:32:23.035122 containerd[1480]: time="2025-03-21T13:32:23.035057808Z" level=info msg="StartContainer for \"96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7\" returns successfully"
Mar 21 13:32:23.045698 systemd[1]: cri-containerd-96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7.scope: Deactivated successfully.
Mar 21 13:32:23.049367 containerd[1480]: time="2025-03-21T13:32:23.049323070Z" level=info msg="TaskExit event in podsandbox handler container_id:\"96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7\" id:\"96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7\" pid:3333 exited_at:{seconds:1742563943 nanos:48765840}"
Mar 21 13:32:23.049490 containerd[1480]: time="2025-03-21T13:32:23.049465867Z" level=info msg="received exit event container_id:\"96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7\" id:\"96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7\" pid:3333 exited_at:{seconds:1742563943 nanos:48765840}"
Mar 21 13:32:23.073857 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7-rootfs.mount: Deactivated successfully.
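The TaskExit events above carry `exited_at` as epoch seconds plus a nanosecond remainder rather than a formatted timestamp. Converting the pair confirms it lines up with the surrounding journal clock (a small sketch; the formatting helper is mine, not a containerd API):

```python
from datetime import datetime, timezone

def exited_at_to_utc(seconds: int, nanos: int) -> str:
    """Render a containerd exited_at {seconds, nanos} pair as a
    journal-style UTC timestamp with nanosecond precision."""
    ts = datetime.fromtimestamp(seconds, tz=timezone.utc)
    return ts.strftime("%b %d %H:%M:%S") + f".{nanos:09d}"

# The flexvol-driver init container's exit event from the log above:
stamp = exited_at_to_utc(1742563943, 48765840)
# "Mar 21 13:32:23.048765840" — matching the Mar 21 13:32:23 entries around it
```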
Mar 21 13:32:23.191758 kubelet[2781]: E0321 13:32:23.188288 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hh6d2" podUID="853e0129-1988-42f8-a91f-ebe3cb4d5800"
Mar 21 13:32:25.190061 kubelet[2781]: E0321 13:32:25.189951 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hh6d2" podUID="853e0129-1988-42f8-a91f-ebe3cb4d5800"
Mar 21 13:32:26.286302 containerd[1480]: time="2025-03-21T13:32:26.286235265Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:26.287716 containerd[1480]: time="2025-03-21T13:32:26.287543890Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075"
Mar 21 13:32:26.290289 containerd[1480]: time="2025-03-21T13:32:26.289159127Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:26.291472 containerd[1480]: time="2025-03-21T13:32:26.291280751Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:26.292338 containerd[1480]: time="2025-03-21T13:32:26.291886824Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 3.396563167s"
Mar 21 13:32:26.292338 containerd[1480]: time="2025-03-21T13:32:26.291930035Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\""
Mar 21 13:32:26.293689 containerd[1480]: time="2025-03-21T13:32:26.293504808Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\""
Mar 21 13:32:26.308087 containerd[1480]: time="2025-03-21T13:32:26.308008877Z" level=info msg="CreateContainer within sandbox \"5f047371f53fe4635c0bc00385aa2d0412afe5a1fce0e8302e7cfefefe0bfe4a\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 21 13:32:26.321426 containerd[1480]: time="2025-03-21T13:32:26.320579154Z" level=info msg="Container dd18038fd36a59a4f641184f874543283a56b4ecb6d981fd87083f23e63c1ada: CDI devices from CRI Config.CDIDevices: []"
Mar 21 13:32:26.336669 containerd[1480]: time="2025-03-21T13:32:26.336560754Z" level=info msg="CreateContainer within sandbox \"5f047371f53fe4635c0bc00385aa2d0412afe5a1fce0e8302e7cfefefe0bfe4a\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"dd18038fd36a59a4f641184f874543283a56b4ecb6d981fd87083f23e63c1ada\""
Mar 21 13:32:26.337549 containerd[1480]: time="2025-03-21T13:32:26.337472357Z" level=info msg="StartContainer for \"dd18038fd36a59a4f641184f874543283a56b4ecb6d981fd87083f23e63c1ada\""
Mar 21 13:32:26.338725 containerd[1480]: time="2025-03-21T13:32:26.338693809Z" level=info msg="connecting to shim dd18038fd36a59a4f641184f874543283a56b4ecb6d981fd87083f23e63c1ada" address="unix:///run/containerd/s/93c6ab63830fd068f66424f9fbc5f7a01f2b97d5ba5c8703f5adb780f4da63da" protocol=ttrpc version=3
Mar 21 13:32:26.364594 systemd[1]: Started cri-containerd-dd18038fd36a59a4f641184f874543283a56b4ecb6d981fd87083f23e63c1ada.scope - libcontainer container dd18038fd36a59a4f641184f874543283a56b4ecb6d981fd87083f23e63c1ada.
Mar 21 13:32:26.430531 containerd[1480]: time="2025-03-21T13:32:26.430004887Z" level=info msg="StartContainer for \"dd18038fd36a59a4f641184f874543283a56b4ecb6d981fd87083f23e63c1ada\" returns successfully"
Mar 21 13:32:27.189352 kubelet[2781]: E0321 13:32:27.189269 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hh6d2" podUID="853e0129-1988-42f8-a91f-ebe3cb4d5800"
Mar 21 13:32:27.407388 kubelet[2781]: I0321 13:32:27.406055 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7c5f65576d-bhkrq" podStartSLOduration=3.929523135 podStartE2EDuration="9.405854699s" podCreationTimestamp="2025-03-21 13:32:18 +0000 UTC" firstStartedPulling="2025-03-21 13:32:20.816535081 +0000 UTC m=+24.742753462" lastFinishedPulling="2025-03-21 13:32:26.292866645 +0000 UTC m=+30.219085026" observedRunningTime="2025-03-21 13:32:27.37904208 +0000 UTC m=+31.305260521" watchObservedRunningTime="2025-03-21 13:32:27.405854699 +0000 UTC m=+31.332073130"
Mar 21 13:32:29.189146 kubelet[2781]: E0321 13:32:29.189066 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hh6d2" podUID="853e0129-1988-42f8-a91f-ebe3cb4d5800"
Mar 21 13:32:31.188208 kubelet[2781]: E0321 13:32:31.188103 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hh6d2" podUID="853e0129-1988-42f8-a91f-ebe3cb4d5800"
Mar 21 13:32:32.663008 containerd[1480]: time="2025-03-21T13:32:32.662965557Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:32.664471 containerd[1480]: time="2025-03-21T13:32:32.664314180Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477"
Mar 21 13:32:32.665760 containerd[1480]: time="2025-03-21T13:32:32.665719239Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:32.668269 containerd[1480]: time="2025-03-21T13:32:32.668174982Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:32.668933 containerd[1480]: time="2025-03-21T13:32:32.668905630Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 6.375373833s"
Mar 21 13:32:32.668984 containerd[1480]: time="2025-03-21T13:32:32.668935014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\""
Mar 21 13:32:32.672479 containerd[1480]: time="2025-03-21T13:32:32.672387022Z" level=info msg="CreateContainer within sandbox \"d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 21 13:32:32.687606 containerd[1480]: time="2025-03-21T13:32:32.686629746Z" level=info msg="Container 050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370: CDI devices from CRI Config.CDIDevices: []"
Mar 21 13:32:32.691240 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount512488257.mount: Deactivated successfully.
Mar 21 13:32:32.704968 containerd[1480]: time="2025-03-21T13:32:32.704921534Z" level=info msg="CreateContainer within sandbox \"d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370\""
Mar 21 13:32:32.705717 containerd[1480]: time="2025-03-21T13:32:32.705658724Z" level=info msg="StartContainer for \"050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370\""
Mar 21 13:32:32.707749 containerd[1480]: time="2025-03-21T13:32:32.707609232Z" level=info msg="connecting to shim 050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370" address="unix:///run/containerd/s/3f9496020f474f6cf514eb34d87107a7c871b58013d637a426f6ab6bd44a3736" protocol=ttrpc version=3
Mar 21 13:32:32.734586 systemd[1]: Started cri-containerd-050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370.scope - libcontainer container 050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370.
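The pod_startup_latency_tracker entry logged at 13:32:27 reports both podStartE2EDuration="9.405854699s" and podStartSLOduration=3.929523135 for calico-typha. The numbers suggest the SLO duration is the end-to-end startup time minus the window spent pulling images (an inference from this log's own values, not a statement of the tracker's internals):

```python
from decimal import Decimal

# Monotonic offsets ("m=+...") from the pod_startup_latency_tracker entry:
first_started_pulling = Decimal("24.742753462")
last_finished_pulling = Decimal("30.219085026")
e2e_duration          = Decimal("9.405854699")   # podStartE2EDuration

# Assumed relationship: SLO duration excludes the image-pull window.
image_pull_time = last_finished_pulling - first_started_pulling
slo_duration = e2e_duration - image_pull_time
```

With exact decimal arithmetic this reproduces the logged podStartSLOduration of 3.929523135 to the last digit, which is why typha's 5.5 s of image pulling does not count against its startup SLO.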
Mar 21 13:32:32.790820 containerd[1480]: time="2025-03-21T13:32:32.790749452Z" level=info msg="StartContainer for \"050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370\" returns successfully"
Mar 21 13:32:33.191497 kubelet[2781]: E0321 13:32:33.189595 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hh6d2" podUID="853e0129-1988-42f8-a91f-ebe3cb4d5800"
Mar 21 13:32:34.679174 containerd[1480]: time="2025-03-21T13:32:34.679087093Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 21 13:32:34.683674 systemd[1]: cri-containerd-050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370.scope: Deactivated successfully.
Mar 21 13:32:34.685994 systemd[1]: cri-containerd-050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370.scope: Consumed 725ms CPU time, 172.4M memory peak, 154M written to disk.
Mar 21 13:32:34.688900 containerd[1480]: time="2025-03-21T13:32:34.687866713Z" level=info msg="received exit event container_id:\"050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370\" id:\"050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370\" pid:3434 exited_at:{seconds:1742563954 nanos:687203362}"
Mar 21 13:32:34.688900 containerd[1480]: time="2025-03-21T13:32:34.688387688Z" level=info msg="TaskExit event in podsandbox handler container_id:\"050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370\" id:\"050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370\" pid:3434 exited_at:{seconds:1742563954 nanos:687203362}"
Mar 21 13:32:34.729345 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370-rootfs.mount: Deactivated successfully.
Mar 21 13:32:34.747876 kubelet[2781]: I0321 13:32:34.747812 2781 kubelet_node_status.go:497] "Fast updating node status as it just became ready"
Mar 21 13:32:34.900428 kubelet[2781]: I0321 13:32:34.900063 2781 topology_manager.go:215] "Topology Admit Handler" podUID="ac20d49b-f5e7-4afb-8b4c-03a33917614e" podNamespace="calico-system" podName="calico-kube-controllers-5c6dcfd6c4-2svmz"
Mar 21 13:32:34.910255 systemd[1]: Created slice kubepods-besteffort-podac20d49b_f5e7_4afb_8b4c_03a33917614e.slice - libcontainer container kubepods-besteffort-podac20d49b_f5e7_4afb_8b4c_03a33917614e.slice.
Mar 21 13:32:34.979358 kubelet[2781]: I0321 13:32:34.979174 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2fn4\" (UniqueName: \"kubernetes.io/projected/ac20d49b-f5e7-4afb-8b4c-03a33917614e-kube-api-access-g2fn4\") pod \"calico-kube-controllers-5c6dcfd6c4-2svmz\" (UID: \"ac20d49b-f5e7-4afb-8b4c-03a33917614e\") " pod="calico-system/calico-kube-controllers-5c6dcfd6c4-2svmz"
Mar 21 13:32:34.979358 kubelet[2781]: I0321 13:32:34.979217 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac20d49b-f5e7-4afb-8b4c-03a33917614e-tigera-ca-bundle\") pod \"calico-kube-controllers-5c6dcfd6c4-2svmz\" (UID: \"ac20d49b-f5e7-4afb-8b4c-03a33917614e\") " pod="calico-system/calico-kube-controllers-5c6dcfd6c4-2svmz"
Mar 21 13:32:35.092625 kubelet[2781]: I0321 13:32:35.090831 2781 topology_manager.go:215] "Topology Admit Handler" podUID="bad9e428-aaad-4096-9b87-4071585b40fd" podNamespace="kube-system" podName="coredns-7db6d8ff4d-nf9ts"
Mar 21 13:32:35.113750 kubelet[2781]: I0321 13:32:35.113669 2781 topology_manager.go:215] "Topology Admit Handler" podUID="c18e3485-1786-4c2c-91d4-a50eb1c579da" podNamespace="calico-apiserver" podName="calico-apiserver-7d8f557dbd-b5fxs"
Mar 21 13:32:35.118836 systemd[1]: Created slice kubepods-burstable-podbad9e428_aaad_4096_9b87_4071585b40fd.slice - libcontainer container kubepods-burstable-podbad9e428_aaad_4096_9b87_4071585b40fd.slice.
Mar 21 13:32:35.145599 systemd[1]: Created slice kubepods-besteffort-podc18e3485_1786_4c2c_91d4_a50eb1c579da.slice - libcontainer container kubepods-besteffort-podc18e3485_1786_4c2c_91d4_a50eb1c579da.slice.
Mar 21 13:32:35.151706 kubelet[2781]: I0321 13:32:35.147602 2781 topology_manager.go:215] "Topology Admit Handler" podUID="a6949731-d87b-46e5-9f9a-911cd33e8de2" podNamespace="calico-apiserver" podName="calico-apiserver-7d8f557dbd-4qzd9"
Mar 21 13:32:35.158285 kubelet[2781]: I0321 13:32:35.157484 2781 topology_manager.go:215] "Topology Admit Handler" podUID="f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad" podNamespace="kube-system" podName="coredns-7db6d8ff4d-pg92t"
Mar 21 13:32:35.181559 kubelet[2781]: I0321 13:32:35.180891 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb4wj\" (UniqueName: \"kubernetes.io/projected/c18e3485-1786-4c2c-91d4-a50eb1c579da-kube-api-access-lb4wj\") pod \"calico-apiserver-7d8f557dbd-b5fxs\" (UID: \"c18e3485-1786-4c2c-91d4-a50eb1c579da\") " pod="calico-apiserver/calico-apiserver-7d8f557dbd-b5fxs"
Mar 21 13:32:35.181559 kubelet[2781]: I0321 13:32:35.180928 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c18e3485-1786-4c2c-91d4-a50eb1c579da-calico-apiserver-certs\") pod \"calico-apiserver-7d8f557dbd-b5fxs\" (UID: \"c18e3485-1786-4c2c-91d4-a50eb1c579da\") " pod="calico-apiserver/calico-apiserver-7d8f557dbd-b5fxs"
Mar 21 13:32:35.181559 kubelet[2781]: I0321 13:32:35.180950 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a6949731-d87b-46e5-9f9a-911cd33e8de2-calico-apiserver-certs\") pod \"calico-apiserver-7d8f557dbd-4qzd9\" (UID: \"a6949731-d87b-46e5-9f9a-911cd33e8de2\") " pod="calico-apiserver/calico-apiserver-7d8f557dbd-4qzd9"
Mar 21 13:32:35.181559 kubelet[2781]: I0321 13:32:35.180972 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bad9e428-aaad-4096-9b87-4071585b40fd-config-volume\") pod \"coredns-7db6d8ff4d-nf9ts\" (UID: \"bad9e428-aaad-4096-9b87-4071585b40fd\") " pod="kube-system/coredns-7db6d8ff4d-nf9ts"
Mar 21 13:32:35.181559 kubelet[2781]: I0321 13:32:35.180991 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad-config-volume\") pod \"coredns-7db6d8ff4d-pg92t\" (UID: \"f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad\") " pod="kube-system/coredns-7db6d8ff4d-pg92t"
Mar 21 13:32:35.181773 kubelet[2781]: I0321 13:32:35.181026 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pghv\" (UniqueName: \"kubernetes.io/projected/f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad-kube-api-access-4pghv\") pod \"coredns-7db6d8ff4d-pg92t\" (UID: \"f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad\") " pod="kube-system/coredns-7db6d8ff4d-pg92t"
Mar 21 13:32:35.181773 kubelet[2781]: I0321 13:32:35.181047 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv2md\" (UniqueName: \"kubernetes.io/projected/bad9e428-aaad-4096-9b87-4071585b40fd-kube-api-access-vv2md\") pod \"coredns-7db6d8ff4d-nf9ts\" (UID: \"bad9e428-aaad-4096-9b87-4071585b40fd\") " pod="kube-system/coredns-7db6d8ff4d-nf9ts"
Mar 21 13:32:35.181773 kubelet[2781]: I0321 13:32:35.181066 2781 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2qr7\" (UniqueName: \"kubernetes.io/projected/a6949731-d87b-46e5-9f9a-911cd33e8de2-kube-api-access-l2qr7\") pod \"calico-apiserver-7d8f557dbd-4qzd9\" (UID: \"a6949731-d87b-46e5-9f9a-911cd33e8de2\") " pod="calico-apiserver/calico-apiserver-7d8f557dbd-4qzd9"
Mar 21 13:32:35.183166 systemd[1]: Created slice kubepods-besteffort-poda6949731_d87b_46e5_9f9a_911cd33e8de2.slice - libcontainer container kubepods-besteffort-poda6949731_d87b_46e5_9f9a_911cd33e8de2.slice.
Mar 21 13:32:35.191562 systemd[1]: Created slice kubepods-burstable-podf24e682f_5b8e_4e9a_ba10_e1b043d7c2ad.slice - libcontainer container kubepods-burstable-podf24e682f_5b8e_4e9a_ba10_e1b043d7c2ad.slice.
Mar 21 13:32:35.199597 systemd[1]: Created slice kubepods-besteffort-pod853e0129_1988_42f8_a91f_ebe3cb4d5800.slice - libcontainer container kubepods-besteffort-pod853e0129_1988_42f8_a91f_ebe3cb4d5800.slice.
Mar 21 13:32:35.205532 containerd[1480]: time="2025-03-21T13:32:35.204206622Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hh6d2,Uid:853e0129-1988-42f8-a91f-ebe3cb4d5800,Namespace:calico-system,Attempt:0,}"
Mar 21 13:32:35.214715 containerd[1480]: time="2025-03-21T13:32:35.214647172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6dcfd6c4-2svmz,Uid:ac20d49b-f5e7-4afb-8b4c-03a33917614e,Namespace:calico-system,Attempt:0,}"
Mar 21 13:32:35.339685 containerd[1480]: time="2025-03-21T13:32:35.339558278Z" level=error msg="Failed to destroy network for sandbox \"558abec20fbce177b015c57615bf5b49f4b0c719e2dcfabab52e88ae89241953\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 13:32:35.342986 containerd[1480]: time="2025-03-21T13:32:35.342867371Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hh6d2,Uid:853e0129-1988-42f8-a91f-ebe3cb4d5800,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"558abec20fbce177b015c57615bf5b49f4b0c719e2dcfabab52e88ae89241953\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 13:32:35.344585 kubelet[2781]: E0321 13:32:35.343190 2781 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"558abec20fbce177b015c57615bf5b49f4b0c719e2dcfabab52e88ae89241953\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 21 13:32:35.344585 kubelet[2781]: E0321 13:32:35.343256 2781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"558abec20fbce177b015c57615bf5b49f4b0c719e2dcfabab52e88ae89241953\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hh6d2"
Mar 21 13:32:35.344585 kubelet[2781]: E0321 13:32:35.343279 2781 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"558abec20fbce177b015c57615bf5b49f4b0c719e2dcfabab52e88ae89241953\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hh6d2"
Mar 21 13:32:35.344725 kubelet[2781]: E0321 13:32:35.343318 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-hh6d2_calico-system(853e0129-1988-42f8-a91f-ebe3cb4d5800)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-hh6d2_calico-system(853e0129-1988-42f8-a91f-ebe3cb4d5800)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"558abec20fbce177b015c57615bf5b49f4b0c719e2dcfabab52e88ae89241953\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file
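The "Created slice" entries show the naming scheme the kubelet's systemd cgroup driver applies: the QoS class of the pod plus its UID, with the UID's dashes escaped to underscores (systemd reserves "-" as a hierarchy separator in unit names). A small sketch reproducing the names visible in the journal (the helper is mine, inferred from the log's own podUID/slice pairs):

```python
def pod_slice_name(qos: str, pod_uid: str) -> str:
    """Build the systemd slice name for a pod cgroup: dashes in the
    pod UID are replaced with underscores, as in the journal above."""
    return f"kubepods-{qos}-pod{pod_uid.replace('-', '_')}.slice"

# The calico-kube-controllers pod admitted at 13:32:34.900428:
name = pod_slice_name("besteffort", "ac20d49b-f5e7-4afb-8b4c-03a33917614e")
# "kubepods-besteffort-podac20d49b_f5e7_4afb_8b4c_03a33917614e.slice"
```

Matching each Topology Admit Handler podUID against the subsequent Created-slice line this way makes it easy to map pods to their cgroups when reading the journal.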
or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-hh6d2" podUID="853e0129-1988-42f8-a91f-ebe3cb4d5800" Mar 21 13:32:35.349105 containerd[1480]: time="2025-03-21T13:32:35.349045184Z" level=error msg="Failed to destroy network for sandbox \"6a6ce3aa0ac95041b3a2af1af086abcc09f69008f0a4151ac8ddda76cf90a72f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.350893 containerd[1480]: time="2025-03-21T13:32:35.350847216Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6dcfd6c4-2svmz,Uid:ac20d49b-f5e7-4afb-8b4c-03a33917614e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a6ce3aa0ac95041b3a2af1af086abcc09f69008f0a4151ac8ddda76cf90a72f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.351105 kubelet[2781]: E0321 13:32:35.351075 2781 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a6ce3aa0ac95041b3a2af1af086abcc09f69008f0a4151ac8ddda76cf90a72f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.351340 kubelet[2781]: E0321 13:32:35.351191 2781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a6ce3aa0ac95041b3a2af1af086abcc09f69008f0a4151ac8ddda76cf90a72f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5c6dcfd6c4-2svmz" Mar 21 13:32:35.351340 kubelet[2781]: E0321 13:32:35.351217 2781 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a6ce3aa0ac95041b3a2af1af086abcc09f69008f0a4151ac8ddda76cf90a72f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5c6dcfd6c4-2svmz" Mar 21 13:32:35.351340 kubelet[2781]: E0321 13:32:35.351274 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5c6dcfd6c4-2svmz_calico-system(ac20d49b-f5e7-4afb-8b4c-03a33917614e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5c6dcfd6c4-2svmz_calico-system(ac20d49b-f5e7-4afb-8b4c-03a33917614e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6a6ce3aa0ac95041b3a2af1af086abcc09f69008f0a4151ac8ddda76cf90a72f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5c6dcfd6c4-2svmz" podUID="ac20d49b-f5e7-4afb-8b4c-03a33917614e" Mar 21 13:32:35.373920 containerd[1480]: time="2025-03-21T13:32:35.373876847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 21 13:32:35.430364 containerd[1480]: time="2025-03-21T13:32:35.430177336Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-nf9ts,Uid:bad9e428-aaad-4096-9b87-4071585b40fd,Namespace:kube-system,Attempt:0,}" Mar 21 13:32:35.465681 containerd[1480]: time="2025-03-21T13:32:35.465564814Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-7d8f557dbd-b5fxs,Uid:c18e3485-1786-4c2c-91d4-a50eb1c579da,Namespace:calico-apiserver,Attempt:0,}" Mar 21 13:32:35.488115 containerd[1480]: time="2025-03-21T13:32:35.488056849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f557dbd-4qzd9,Uid:a6949731-d87b-46e5-9f9a-911cd33e8de2,Namespace:calico-apiserver,Attempt:0,}" Mar 21 13:32:35.505034 containerd[1480]: time="2025-03-21T13:32:35.504657818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-pg92t,Uid:f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad,Namespace:kube-system,Attempt:0,}" Mar 21 13:32:35.534821 containerd[1480]: time="2025-03-21T13:32:35.534737364Z" level=error msg="Failed to destroy network for sandbox \"6f3ff44149e100d4945536b9e67c9c4c40f00705cfaed0871b99fdfebe527967\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.538710 containerd[1480]: time="2025-03-21T13:32:35.538132388Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-nf9ts,Uid:bad9e428-aaad-4096-9b87-4071585b40fd,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f3ff44149e100d4945536b9e67c9c4c40f00705cfaed0871b99fdfebe527967\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.538842 kubelet[2781]: E0321 13:32:35.538365 2781 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f3ff44149e100d4945536b9e67c9c4c40f00705cfaed0871b99fdfebe527967\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.538842 kubelet[2781]: E0321 13:32:35.538457 2781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f3ff44149e100d4945536b9e67c9c4c40f00705cfaed0871b99fdfebe527967\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-nf9ts" Mar 21 13:32:35.538842 kubelet[2781]: E0321 13:32:35.538483 2781 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f3ff44149e100d4945536b9e67c9c4c40f00705cfaed0871b99fdfebe527967\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-nf9ts" Mar 21 13:32:35.538936 kubelet[2781]: E0321 13:32:35.538562 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-nf9ts_kube-system(bad9e428-aaad-4096-9b87-4071585b40fd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-nf9ts_kube-system(bad9e428-aaad-4096-9b87-4071585b40fd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6f3ff44149e100d4945536b9e67c9c4c40f00705cfaed0871b99fdfebe527967\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-nf9ts" podUID="bad9e428-aaad-4096-9b87-4071585b40fd" Mar 21 13:32:35.568444 containerd[1480]: time="2025-03-21T13:32:35.568170356Z" level=error msg="Failed to destroy network for sandbox 
\"9c24ccb2340d9114e3b9dbeefe2d1f7809ad7f8bf00cb86d4e61c09de3260b15\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.570666 containerd[1480]: time="2025-03-21T13:32:35.570122068Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f557dbd-b5fxs,Uid:c18e3485-1786-4c2c-91d4-a50eb1c579da,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c24ccb2340d9114e3b9dbeefe2d1f7809ad7f8bf00cb86d4e61c09de3260b15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.570769 kubelet[2781]: E0321 13:32:35.570316 2781 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c24ccb2340d9114e3b9dbeefe2d1f7809ad7f8bf00cb86d4e61c09de3260b15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.570769 kubelet[2781]: E0321 13:32:35.570367 2781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c24ccb2340d9114e3b9dbeefe2d1f7809ad7f8bf00cb86d4e61c09de3260b15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d8f557dbd-b5fxs" Mar 21 13:32:35.570769 kubelet[2781]: E0321 13:32:35.570389 2781 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"9c24ccb2340d9114e3b9dbeefe2d1f7809ad7f8bf00cb86d4e61c09de3260b15\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d8f557dbd-b5fxs" Mar 21 13:32:35.570870 kubelet[2781]: E0321 13:32:35.570428 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d8f557dbd-b5fxs_calico-apiserver(c18e3485-1786-4c2c-91d4-a50eb1c579da)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d8f557dbd-b5fxs_calico-apiserver(c18e3485-1786-4c2c-91d4-a50eb1c579da)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9c24ccb2340d9114e3b9dbeefe2d1f7809ad7f8bf00cb86d4e61c09de3260b15\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d8f557dbd-b5fxs" podUID="c18e3485-1786-4c2c-91d4-a50eb1c579da" Mar 21 13:32:35.593480 containerd[1480]: time="2025-03-21T13:32:35.593367784Z" level=error msg="Failed to destroy network for sandbox \"f4780c10bdfeb9df7f897a3c9130e67873e8840db254c769866474c8e6f68488\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.595191 containerd[1480]: time="2025-03-21T13:32:35.595137265Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-pg92t,Uid:f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4780c10bdfeb9df7f897a3c9130e67873e8840db254c769866474c8e6f68488\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.596480 containerd[1480]: time="2025-03-21T13:32:35.596174577Z" level=error msg="Failed to destroy network for sandbox \"2640c5b1c9782fc9aafe5045f3800de111cc9ca5cb97dcede2b24f15dc6bd29f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.596545 kubelet[2781]: E0321 13:32:35.596292 2781 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4780c10bdfeb9df7f897a3c9130e67873e8840db254c769866474c8e6f68488\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.596545 kubelet[2781]: E0321 13:32:35.596363 2781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4780c10bdfeb9df7f897a3c9130e67873e8840db254c769866474c8e6f68488\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-pg92t" Mar 21 13:32:35.596545 kubelet[2781]: E0321 13:32:35.596383 2781 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4780c10bdfeb9df7f897a3c9130e67873e8840db254c769866474c8e6f68488\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-pg92t" Mar 21 13:32:35.596684 kubelet[2781]: 
E0321 13:32:35.596428 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-pg92t_kube-system(f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-pg92t_kube-system(f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f4780c10bdfeb9df7f897a3c9130e67873e8840db254c769866474c8e6f68488\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-pg92t" podUID="f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad" Mar 21 13:32:35.597920 containerd[1480]: time="2025-03-21T13:32:35.597829063Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f557dbd-4qzd9,Uid:a6949731-d87b-46e5-9f9a-911cd33e8de2,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2640c5b1c9782fc9aafe5045f3800de111cc9ca5cb97dcede2b24f15dc6bd29f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.598472 kubelet[2781]: E0321 13:32:35.598088 2781 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2640c5b1c9782fc9aafe5045f3800de111cc9ca5cb97dcede2b24f15dc6bd29f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 21 13:32:35.598472 kubelet[2781]: E0321 13:32:35.598135 2781 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"2640c5b1c9782fc9aafe5045f3800de111cc9ca5cb97dcede2b24f15dc6bd29f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d8f557dbd-4qzd9" Mar 21 13:32:35.598650 kubelet[2781]: E0321 13:32:35.598158 2781 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2640c5b1c9782fc9aafe5045f3800de111cc9ca5cb97dcede2b24f15dc6bd29f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d8f557dbd-4qzd9" Mar 21 13:32:35.598650 kubelet[2781]: E0321 13:32:35.598613 2781 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d8f557dbd-4qzd9_calico-apiserver(a6949731-d87b-46e5-9f9a-911cd33e8de2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d8f557dbd-4qzd9_calico-apiserver(a6949731-d87b-46e5-9f9a-911cd33e8de2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2640c5b1c9782fc9aafe5045f3800de111cc9ca5cb97dcede2b24f15dc6bd29f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d8f557dbd-4qzd9" podUID="a6949731-d87b-46e5-9f9a-911cd33e8de2" Mar 21 13:32:35.752637 systemd[1]: run-netns-cni\x2d8bbde809\x2d5a3c\x2d0db5\x2d2a17\x2d80f7765f576c.mount: Deactivated successfully. Mar 21 13:32:35.752869 systemd[1]: run-netns-cni\x2d00f2728f\x2d969c\x2db2f2\x2d0732\x2d21c533b410b7.mount: Deactivated successfully. 
Mar 21 13:32:43.827227 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount987617116.mount: Deactivated successfully. Mar 21 13:32:44.310610 containerd[1480]: time="2025-03-21T13:32:44.310531995Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:32:44.312027 containerd[1480]: time="2025-03-21T13:32:44.311985488Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 21 13:32:44.313635 containerd[1480]: time="2025-03-21T13:32:44.313393824Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:32:44.316577 containerd[1480]: time="2025-03-21T13:32:44.316487070Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:32:44.318450 containerd[1480]: time="2025-03-21T13:32:44.317082559Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 8.943169113s" Mar 21 13:32:44.318450 containerd[1480]: time="2025-03-21T13:32:44.317115241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 21 13:32:44.341045 containerd[1480]: time="2025-03-21T13:32:44.339860769Z" level=info msg="CreateContainer within sandbox \"d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b\" for container 
&ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 21 13:32:44.369549 containerd[1480]: time="2025-03-21T13:32:44.369362076Z" level=info msg="Container 1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a: CDI devices from CRI Config.CDIDevices: []" Mar 21 13:32:44.386770 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1218319341.mount: Deactivated successfully. Mar 21 13:32:44.409828 containerd[1480]: time="2025-03-21T13:32:44.409752304Z" level=info msg="CreateContainer within sandbox \"d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\"" Mar 21 13:32:44.412549 containerd[1480]: time="2025-03-21T13:32:44.411363631Z" level=info msg="StartContainer for \"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\"" Mar 21 13:32:44.421685 containerd[1480]: time="2025-03-21T13:32:44.421576458Z" level=info msg="connecting to shim 1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a" address="unix:///run/containerd/s/3f9496020f474f6cf514eb34d87107a7c871b58013d637a426f6ab6bd44a3736" protocol=ttrpc version=3 Mar 21 13:32:44.459640 systemd[1]: Started cri-containerd-1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a.scope - libcontainer container 1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a. Mar 21 13:32:44.513239 containerd[1480]: time="2025-03-21T13:32:44.513205626Z" level=info msg="StartContainer for \"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" returns successfully" Mar 21 13:32:44.582194 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 21 13:32:44.582283 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Mar 21 13:32:45.478798 containerd[1480]: time="2025-03-21T13:32:45.478756145Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"68e28a237edbabe6ded7652533e7d57ce60cd18d403a590791c126e719890480\" pid:3730 exit_status:1 exited_at:{seconds:1742563965 nanos:478023523}" Mar 21 13:32:46.297482 kernel: bpftool[3861]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 21 13:32:46.521073 containerd[1480]: time="2025-03-21T13:32:46.521026603Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"9950ce21f9137f4b999e28b3ec7921fdaa0cb4b00566b6f00a0cb6aa6fb6792c\" pid:3874 exit_status:1 exited_at:{seconds:1742563966 nanos:520133964}" Mar 21 13:32:46.570312 systemd-networkd[1380]: vxlan.calico: Link UP Mar 21 13:32:46.570320 systemd-networkd[1380]: vxlan.calico: Gained carrier Mar 21 13:32:47.191369 containerd[1480]: time="2025-03-21T13:32:47.190692478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hh6d2,Uid:853e0129-1988-42f8-a91f-ebe3cb4d5800,Namespace:calico-system,Attempt:0,}" Mar 21 13:32:47.480064 systemd-networkd[1380]: cali0e939b7ca19: Link UP Mar 21 13:32:47.481116 systemd-networkd[1380]: cali0e939b7ca19: Gained carrier Mar 21 13:32:47.512110 kubelet[2781]: I0321 13:32:47.510303 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-bcwqc" podStartSLOduration=4.928640286 podStartE2EDuration="28.510272557s" podCreationTimestamp="2025-03-21 13:32:19 +0000 UTC" firstStartedPulling="2025-03-21 13:32:20.737887147 +0000 UTC m=+24.664105528" lastFinishedPulling="2025-03-21 13:32:44.319519358 +0000 UTC m=+48.245737799" observedRunningTime="2025-03-21 13:32:45.465665987 +0000 UTC m=+49.391884399" watchObservedRunningTime="2025-03-21 13:32:47.510272557 +0000 UTC m=+51.436490978" Mar 21 13:32:47.519312 
containerd[1480]: 2025-03-21 13:32:47.332 [INFO][3952] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0 csi-node-driver- calico-system 853e0129-1988-42f8-a91f-ebe3cb4d5800 625 0 2025-03-21 13:32:19 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-9999-0-3-4-20a459a426.novalocal csi-node-driver-hh6d2 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali0e939b7ca19 [] []}} ContainerID="6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" Namespace="calico-system" Pod="csi-node-driver-hh6d2" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-" Mar 21 13:32:47.519312 containerd[1480]: 2025-03-21 13:32:47.337 [INFO][3952] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" Namespace="calico-system" Pod="csi-node-driver-hh6d2" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0" Mar 21 13:32:47.519312 containerd[1480]: 2025-03-21 13:32:47.410 [INFO][3964] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" HandleID="k8s-pod-network.6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0" Mar 21 13:32:47.520341 containerd[1480]: 2025-03-21 13:32:47.426 [INFO][3964] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" 
HandleID="k8s-pod-network.6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002868b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-3-4-20a459a426.novalocal", "pod":"csi-node-driver-hh6d2", "timestamp":"2025-03-21 13:32:47.410579037 +0000 UTC"}, Hostname:"ci-9999-0-3-4-20a459a426.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 13:32:47.520341 containerd[1480]: 2025-03-21 13:32:47.426 [INFO][3964] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 13:32:47.520341 containerd[1480]: 2025-03-21 13:32:47.426 [INFO][3964] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 13:32:47.520341 containerd[1480]: 2025-03-21 13:32:47.426 [INFO][3964] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-4-20a459a426.novalocal' Mar 21 13:32:47.520341 containerd[1480]: 2025-03-21 13:32:47.428 [INFO][3964] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:47.520341 containerd[1480]: 2025-03-21 13:32:47.434 [INFO][3964] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:47.520341 containerd[1480]: 2025-03-21 13:32:47.440 [INFO][3964] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:47.520341 containerd[1480]: 2025-03-21 13:32:47.443 [INFO][3964] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:47.520341 containerd[1480]: 2025-03-21 
13:32:47.446 [INFO][3964] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:47.522001 containerd[1480]: 2025-03-21 13:32:47.446 [INFO][3964] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:47.522001 containerd[1480]: 2025-03-21 13:32:47.449 [INFO][3964] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a Mar 21 13:32:47.522001 containerd[1480]: 2025-03-21 13:32:47.457 [INFO][3964] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:47.522001 containerd[1480]: 2025-03-21 13:32:47.467 [INFO][3964] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.65/26] block=192.168.91.64/26 handle="k8s-pod-network.6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:47.522001 containerd[1480]: 2025-03-21 13:32:47.467 [INFO][3964] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.65/26] handle="k8s-pod-network.6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:47.522001 containerd[1480]: 2025-03-21 13:32:47.467 [INFO][3964] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 21 13:32:47.522001 containerd[1480]: 2025-03-21 13:32:47.467 [INFO][3964] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.65/26] IPv6=[] ContainerID="6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" HandleID="k8s-pod-network.6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0" Mar 21 13:32:47.522394 containerd[1480]: 2025-03-21 13:32:47.472 [INFO][3952] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" Namespace="calico-system" Pod="csi-node-driver-hh6d2" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"853e0129-1988-42f8-a91f-ebe3cb4d5800", ResourceVersion:"625", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"", Pod:"csi-node-driver-hh6d2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", 
IPNetworks:[]string{"192.168.91.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0e939b7ca19", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 13:32:47.524083 containerd[1480]: 2025-03-21 13:32:47.472 [INFO][3952] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.65/32] ContainerID="6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" Namespace="calico-system" Pod="csi-node-driver-hh6d2" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0" Mar 21 13:32:47.524083 containerd[1480]: 2025-03-21 13:32:47.472 [INFO][3952] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0e939b7ca19 ContainerID="6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" Namespace="calico-system" Pod="csi-node-driver-hh6d2" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0" Mar 21 13:32:47.524083 containerd[1480]: 2025-03-21 13:32:47.482 [INFO][3952] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" Namespace="calico-system" Pod="csi-node-driver-hh6d2" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0" Mar 21 13:32:47.524731 containerd[1480]: 2025-03-21 13:32:47.483 [INFO][3952] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" Namespace="calico-system" Pod="csi-node-driver-hh6d2" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"853e0129-1988-42f8-a91f-ebe3cb4d5800", ResourceVersion:"625", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a", Pod:"csi-node-driver-hh6d2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.91.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0e939b7ca19", MAC:"62:84:d9:37:08:0a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 13:32:47.525528 containerd[1480]: 2025-03-21 13:32:47.511 [INFO][3952] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" Namespace="calico-system" Pod="csi-node-driver-hh6d2" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-csi--node--driver--hh6d2-eth0" Mar 21 13:32:47.580960 containerd[1480]: time="2025-03-21T13:32:47.580888089Z" level=info msg="connecting to shim 
6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a" address="unix:///run/containerd/s/75e3e23cbda5f4319172c4131f3134357252d6101f681f373adf822f8c60bab4" namespace=k8s.io protocol=ttrpc version=3 Mar 21 13:32:47.610636 systemd[1]: Started cri-containerd-6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a.scope - libcontainer container 6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a. Mar 21 13:32:47.637741 containerd[1480]: time="2025-03-21T13:32:47.637708595Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hh6d2,Uid:853e0129-1988-42f8-a91f-ebe3cb4d5800,Namespace:calico-system,Attempt:0,} returns sandbox id \"6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a\"" Mar 21 13:32:47.639200 containerd[1480]: time="2025-03-21T13:32:47.639064824Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 21 13:32:48.191806 containerd[1480]: time="2025-03-21T13:32:48.191687968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-pg92t,Uid:f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad,Namespace:kube-system,Attempt:0,}" Mar 21 13:32:48.192604 containerd[1480]: time="2025-03-21T13:32:48.192185154Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f557dbd-4qzd9,Uid:a6949731-d87b-46e5-9f9a-911cd33e8de2,Namespace:calico-apiserver,Attempt:0,}" Mar 21 13:32:48.196687 containerd[1480]: time="2025-03-21T13:32:48.196622186Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6dcfd6c4-2svmz,Uid:ac20d49b-f5e7-4afb-8b4c-03a33917614e,Namespace:calico-system,Attempt:0,}" Mar 21 13:32:48.214780 containerd[1480]: time="2025-03-21T13:32:48.214707285Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f557dbd-b5fxs,Uid:c18e3485-1786-4c2c-91d4-a50eb1c579da,Namespace:calico-apiserver,Attempt:0,}" Mar 21 13:32:48.232060 systemd-networkd[1380]: vxlan.calico: Gained IPv6LL Mar 21 13:32:48.600089 
systemd-networkd[1380]: cali6f92f3ea1f3: Link UP Mar 21 13:32:48.602207 systemd-networkd[1380]: cali6f92f3ea1f3: Gained carrier Mar 21 13:32:48.658772 systemd-networkd[1380]: cali0e939b7ca19: Gained IPv6LL Mar 21 13:32:48.671277 containerd[1480]: 2025-03-21 13:32:48.322 [INFO][4029] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0 coredns-7db6d8ff4d- kube-system f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad 732 0 2025-03-21 13:32:10 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-0-3-4-20a459a426.novalocal coredns-7db6d8ff4d-pg92t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali6f92f3ea1f3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pg92t" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-" Mar 21 13:32:48.671277 containerd[1480]: 2025-03-21 13:32:48.322 [INFO][4029] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pg92t" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0" Mar 21 13:32:48.671277 containerd[1480]: 2025-03-21 13:32:48.366 [INFO][4058] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" HandleID="k8s-pod-network.d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0" Mar 21 13:32:48.674094 containerd[1480]: 
2025-03-21 13:32:48.428 [INFO][4058] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" HandleID="k8s-pod-network.d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004d6930), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-0-3-4-20a459a426.novalocal", "pod":"coredns-7db6d8ff4d-pg92t", "timestamp":"2025-03-21 13:32:48.366486017 +0000 UTC"}, Hostname:"ci-9999-0-3-4-20a459a426.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 13:32:48.674094 containerd[1480]: 2025-03-21 13:32:48.428 [INFO][4058] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 13:32:48.674094 containerd[1480]: 2025-03-21 13:32:48.428 [INFO][4058] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 21 13:32:48.674094 containerd[1480]: 2025-03-21 13:32:48.428 [INFO][4058] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-4-20a459a426.novalocal' Mar 21 13:32:48.674094 containerd[1480]: 2025-03-21 13:32:48.430 [INFO][4058] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.674094 containerd[1480]: 2025-03-21 13:32:48.434 [INFO][4058] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.674094 containerd[1480]: 2025-03-21 13:32:48.439 [INFO][4058] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.674094 containerd[1480]: 2025-03-21 13:32:48.443 [INFO][4058] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.674094 containerd[1480]: 2025-03-21 13:32:48.446 [INFO][4058] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.680396 containerd[1480]: 2025-03-21 13:32:48.446 [INFO][4058] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.680396 containerd[1480]: 2025-03-21 13:32:48.448 [INFO][4058] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102 Mar 21 13:32:48.680396 containerd[1480]: 2025-03-21 13:32:48.513 [INFO][4058] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.680396 
containerd[1480]: 2025-03-21 13:32:48.586 [INFO][4058] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.66/26] block=192.168.91.64/26 handle="k8s-pod-network.d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.680396 containerd[1480]: 2025-03-21 13:32:48.586 [INFO][4058] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.66/26] handle="k8s-pod-network.d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.680396 containerd[1480]: 2025-03-21 13:32:48.586 [INFO][4058] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 21 13:32:48.680396 containerd[1480]: 2025-03-21 13:32:48.586 [INFO][4058] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.66/26] IPv6=[] ContainerID="d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" HandleID="k8s-pod-network.d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0" Mar 21 13:32:48.681480 containerd[1480]: 2025-03-21 13:32:48.589 [INFO][4029] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pg92t" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-pg92t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6f92f3ea1f3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 13:32:48.681480 containerd[1480]: 2025-03-21 13:32:48.589 [INFO][4029] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.66/32] ContainerID="d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pg92t" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0" Mar 21 13:32:48.681480 containerd[1480]: 2025-03-21 13:32:48.590 [INFO][4029] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6f92f3ea1f3 ContainerID="d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pg92t" 
WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0" Mar 21 13:32:48.681480 containerd[1480]: 2025-03-21 13:32:48.602 [INFO][4029] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pg92t" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0" Mar 21 13:32:48.681480 containerd[1480]: 2025-03-21 13:32:48.604 [INFO][4029] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pg92t" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102", Pod:"coredns-7db6d8ff4d-pg92t", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.91.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali6f92f3ea1f3", MAC:"5e:34:b8:1b:88:ce", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 13:32:48.681480 containerd[1480]: 2025-03-21 13:32:48.665 [INFO][4029] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" Namespace="kube-system" Pod="coredns-7db6d8ff4d-pg92t" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--pg92t-eth0" Mar 21 13:32:48.883869 systemd-networkd[1380]: calibe4e569c5f6: Link UP Mar 21 13:32:48.884684 systemd-networkd[1380]: calibe4e569c5f6: Gained carrier Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.352 [INFO][4046] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0 calico-apiserver-7d8f557dbd- calico-apiserver a6949731-d87b-46e5-9f9a-911cd33e8de2 734 0 2025-03-21 13:32:18 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d8f557dbd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-0-3-4-20a459a426.novalocal calico-apiserver-7d8f557dbd-4qzd9 eth0 calico-apiserver [] [] 
[kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calibe4e569c5f6 [] []}} ContainerID="96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-4qzd9" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.414 [INFO][4046] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-4qzd9" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.465 [INFO][4066] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" HandleID="k8s-pod-network.96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.516 [INFO][4066] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" HandleID="k8s-pod-network.96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000291300), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-0-3-4-20a459a426.novalocal", "pod":"calico-apiserver-7d8f557dbd-4qzd9", "timestamp":"2025-03-21 13:32:48.465687028 +0000 UTC"}, Hostname:"ci-9999-0-3-4-20a459a426.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.516 [INFO][4066] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.587 [INFO][4066] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.587 [INFO][4066] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-4-20a459a426.novalocal' Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.591 [INFO][4066] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.603 [INFO][4066] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.617 [INFO][4066] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.664 [INFO][4066] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.683 [INFO][4066] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.683 [INFO][4066] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.785 [INFO][4066] ipam/ipam.go 1685: Creating new handle: 
k8s-pod-network.96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425 Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.845 [INFO][4066] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.872 [INFO][4066] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.67/26] block=192.168.91.64/26 handle="k8s-pod-network.96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.872 [INFO][4066] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.67/26] handle="k8s-pod-network.96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.873 [INFO][4066] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 21 13:32:48.920981 containerd[1480]: 2025-03-21 13:32:48.873 [INFO][4066] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.67/26] IPv6=[] ContainerID="96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" HandleID="k8s-pod-network.96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0" Mar 21 13:32:48.922809 containerd[1480]: 2025-03-21 13:32:48.878 [INFO][4046] cni-plugin/k8s.go 386: Populated endpoint ContainerID="96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-4qzd9" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0", GenerateName:"calico-apiserver-7d8f557dbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"a6949731-d87b-46e5-9f9a-911cd33e8de2", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d8f557dbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"", Pod:"calico-apiserver-7d8f557dbd-4qzd9", Endpoint:"eth0", 
ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibe4e569c5f6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 13:32:48.922809 containerd[1480]: 2025-03-21 13:32:48.878 [INFO][4046] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.67/32] ContainerID="96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-4qzd9" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0" Mar 21 13:32:48.922809 containerd[1480]: 2025-03-21 13:32:48.878 [INFO][4046] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibe4e569c5f6 ContainerID="96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-4qzd9" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0" Mar 21 13:32:48.922809 containerd[1480]: 2025-03-21 13:32:48.885 [INFO][4046] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-4qzd9" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0" Mar 21 13:32:48.922809 containerd[1480]: 2025-03-21 13:32:48.889 [INFO][4046] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-4qzd9" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0", GenerateName:"calico-apiserver-7d8f557dbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"a6949731-d87b-46e5-9f9a-911cd33e8de2", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d8f557dbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425", Pod:"calico-apiserver-7d8f557dbd-4qzd9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibe4e569c5f6", MAC:"9a:68:71:b5:3e:cb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 13:32:48.922809 containerd[1480]: 2025-03-21 13:32:48.911 [INFO][4046] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-4qzd9" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--4qzd9-eth0" Mar 21 
13:32:48.937600 containerd[1480]: time="2025-03-21T13:32:48.937549084Z" level=info msg="connecting to shim d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102" address="unix:///run/containerd/s/c7d8a62dbee1284f51323eef94363fa818dd516c918fbee4f12a4e3314a2d82f" namespace=k8s.io protocol=ttrpc version=3 Mar 21 13:32:48.989669 systemd[1]: Started cri-containerd-d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102.scope - libcontainer container d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102. Mar 21 13:32:49.018070 containerd[1480]: time="2025-03-21T13:32:49.018026125Z" level=info msg="connecting to shim 96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425" address="unix:///run/containerd/s/5a9db7f1094eef4ba880999cc3a254fd4c221c22d1aa27f25eafc97d227b585e" namespace=k8s.io protocol=ttrpc version=3 Mar 21 13:32:49.086212 systemd[1]: Started cri-containerd-96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425.scope - libcontainer container 96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425. 
Mar 21 13:32:49.097574 containerd[1480]: time="2025-03-21T13:32:49.097536478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-pg92t,Uid:f24e682f-5b8e-4e9a-ba10-e1b043d7c2ad,Namespace:kube-system,Attempt:0,} returns sandbox id \"d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102\"" Mar 21 13:32:49.143049 containerd[1480]: time="2025-03-21T13:32:49.142922640Z" level=info msg="CreateContainer within sandbox \"d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 21 13:32:49.149657 systemd-networkd[1380]: calid7c4f8fa998: Link UP Mar 21 13:32:49.149915 systemd-networkd[1380]: calid7c4f8fa998: Gained carrier Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:48.956 [INFO][4088] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0 calico-kube-controllers-5c6dcfd6c4- calico-system ac20d49b-f5e7-4afb-8b4c-03a33917614e 730 0 2025-03-21 13:32:19 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5c6dcfd6c4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-9999-0-3-4-20a459a426.novalocal calico-kube-controllers-5c6dcfd6c4-2svmz eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calid7c4f8fa998 [] []}} ContainerID="029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" Namespace="calico-system" Pod="calico-kube-controllers-5c6dcfd6c4-2svmz" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-" Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:48.956 [INFO][4088] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" Namespace="calico-system" Pod="calico-kube-controllers-5c6dcfd6c4-2svmz" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0" Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.044 [INFO][4152] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" HandleID="k8s-pod-network.029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0" Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.065 [INFO][4152] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" HandleID="k8s-pod-network.029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000102df0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-9999-0-3-4-20a459a426.novalocal", "pod":"calico-kube-controllers-5c6dcfd6c4-2svmz", "timestamp":"2025-03-21 13:32:49.044001084 +0000 UTC"}, Hostname:"ci-9999-0-3-4-20a459a426.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.065 [INFO][4152] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.065 [INFO][4152] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.066 [INFO][4152] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-4-20a459a426.novalocal' Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.069 [INFO][4152] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.080 [INFO][4152] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.096 [INFO][4152] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.101 [INFO][4152] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.107 [INFO][4152] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.108 [INFO][4152] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.110 [INFO][4152] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235 Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.118 [INFO][4152] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.178249 
containerd[1480]: 2025-03-21 13:32:49.127 [INFO][4152] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.68/26] block=192.168.91.64/26 handle="k8s-pod-network.029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.128 [INFO][4152] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.68/26] handle="k8s-pod-network.029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.128 [INFO][4152] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 21 13:32:49.178249 containerd[1480]: 2025-03-21 13:32:49.128 [INFO][4152] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.68/26] IPv6=[] ContainerID="029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" HandleID="k8s-pod-network.029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0" Mar 21 13:32:49.180050 containerd[1480]: 2025-03-21 13:32:49.132 [INFO][4088] cni-plugin/k8s.go 386: Populated endpoint ContainerID="029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" Namespace="calico-system" Pod="calico-kube-controllers-5c6dcfd6c4-2svmz" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0", GenerateName:"calico-kube-controllers-5c6dcfd6c4-", Namespace:"calico-system", SelfLink:"", UID:"ac20d49b-f5e7-4afb-8b4c-03a33917614e", ResourceVersion:"730", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 19, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5c6dcfd6c4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"", Pod:"calico-kube-controllers-5c6dcfd6c4-2svmz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid7c4f8fa998", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 13:32:49.180050 containerd[1480]: 2025-03-21 13:32:49.132 [INFO][4088] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.68/32] ContainerID="029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" Namespace="calico-system" Pod="calico-kube-controllers-5c6dcfd6c4-2svmz" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0" Mar 21 13:32:49.180050 containerd[1480]: 2025-03-21 13:32:49.132 [INFO][4088] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid7c4f8fa998 ContainerID="029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" Namespace="calico-system" Pod="calico-kube-controllers-5c6dcfd6c4-2svmz" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0" Mar 21 13:32:49.180050 containerd[1480]: 2025-03-21 13:32:49.147 [INFO][4088] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" Namespace="calico-system" Pod="calico-kube-controllers-5c6dcfd6c4-2svmz" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0" Mar 21 13:32:49.180050 containerd[1480]: 2025-03-21 13:32:49.154 [INFO][4088] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" Namespace="calico-system" Pod="calico-kube-controllers-5c6dcfd6c4-2svmz" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0", GenerateName:"calico-kube-controllers-5c6dcfd6c4-", Namespace:"calico-system", SelfLink:"", UID:"ac20d49b-f5e7-4afb-8b4c-03a33917614e", ResourceVersion:"730", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5c6dcfd6c4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235", Pod:"calico-kube-controllers-5c6dcfd6c4-2svmz", 
Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid7c4f8fa998", MAC:"4e:bb:ba:b9:3f:a2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 13:32:49.180050 containerd[1480]: 2025-03-21 13:32:49.171 [INFO][4088] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" Namespace="calico-system" Pod="calico-kube-controllers-5c6dcfd6c4-2svmz" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--kube--controllers--5c6dcfd6c4--2svmz-eth0" Mar 21 13:32:49.188352 containerd[1480]: time="2025-03-21T13:32:49.187660868Z" level=info msg="Container 49babdf5ea7ebdac7d33dc87a0ee56439d30bcb04975223f02da966f9559adc8: CDI devices from CRI Config.CDIDevices: []" Mar 21 13:32:49.222051 containerd[1480]: time="2025-03-21T13:32:49.221999679Z" level=info msg="CreateContainer within sandbox \"d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"49babdf5ea7ebdac7d33dc87a0ee56439d30bcb04975223f02da966f9559adc8\"" Mar 21 13:32:49.223467 containerd[1480]: time="2025-03-21T13:32:49.223384411Z" level=info msg="StartContainer for \"49babdf5ea7ebdac7d33dc87a0ee56439d30bcb04975223f02da966f9559adc8\"" Mar 21 13:32:49.231327 containerd[1480]: time="2025-03-21T13:32:49.230857292Z" level=info msg="connecting to shim 49babdf5ea7ebdac7d33dc87a0ee56439d30bcb04975223f02da966f9559adc8" address="unix:///run/containerd/s/c7d8a62dbee1284f51323eef94363fa818dd516c918fbee4f12a4e3314a2d82f" protocol=ttrpc version=3 Mar 21 13:32:49.244538 systemd-networkd[1380]: calid85965a55a6: Link UP Mar 21 13:32:49.245786 systemd-networkd[1380]: calid85965a55a6: Gained carrier Mar 21 13:32:49.259482 
containerd[1480]: time="2025-03-21T13:32:49.258170052Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f557dbd-4qzd9,Uid:a6949731-d87b-46e5-9f9a-911cd33e8de2,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425\"" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:48.959 [INFO][4096] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0 calico-apiserver-7d8f557dbd- calico-apiserver c18e3485-1786-4c2c-91d4-a50eb1c579da 731 0 2025-03-21 13:32:18 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d8f557dbd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-9999-0-3-4-20a459a426.novalocal calico-apiserver-7d8f557dbd-b5fxs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid85965a55a6 [] []}} ContainerID="7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-b5fxs" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:48.959 [INFO][4096] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-b5fxs" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.073 [INFO][4156] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" HandleID="k8s-pod-network.7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.105 [INFO][4156] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" HandleID="k8s-pod-network.7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00058b7c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-9999-0-3-4-20a459a426.novalocal", "pod":"calico-apiserver-7d8f557dbd-b5fxs", "timestamp":"2025-03-21 13:32:49.07323248 +0000 UTC"}, Hostname:"ci-9999-0-3-4-20a459a426.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.105 [INFO][4156] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.128 [INFO][4156] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.128 [INFO][4156] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-4-20a459a426.novalocal' Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.133 [INFO][4156] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.158 [INFO][4156] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.174 [INFO][4156] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.180 [INFO][4156] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.185 [INFO][4156] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.186 [INFO][4156] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.189 [INFO][4156] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.198 [INFO][4156] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.278062 
containerd[1480]: 2025-03-21 13:32:49.215 [INFO][4156] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.69/26] block=192.168.91.64/26 handle="k8s-pod-network.7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.215 [INFO][4156] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.69/26] handle="k8s-pod-network.7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" host="ci-9999-0-3-4-20a459a426.novalocal" Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.215 [INFO][4156] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 21 13:32:49.278062 containerd[1480]: 2025-03-21 13:32:49.215 [INFO][4156] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.69/26] IPv6=[] ContainerID="7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" HandleID="k8s-pod-network.7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0" Mar 21 13:32:49.278847 containerd[1480]: 2025-03-21 13:32:49.224 [INFO][4096] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-b5fxs" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0", GenerateName:"calico-apiserver-7d8f557dbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"c18e3485-1786-4c2c-91d4-a50eb1c579da", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 18, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d8f557dbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"", Pod:"calico-apiserver-7d8f557dbd-b5fxs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid85965a55a6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 13:32:49.278847 containerd[1480]: 2025-03-21 13:32:49.225 [INFO][4096] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.69/32] ContainerID="7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-b5fxs" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0" Mar 21 13:32:49.278847 containerd[1480]: 2025-03-21 13:32:49.226 [INFO][4096] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid85965a55a6 ContainerID="7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-b5fxs" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0" Mar 21 13:32:49.278847 containerd[1480]: 2025-03-21 13:32:49.246 [INFO][4096] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-b5fxs" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0" Mar 21 13:32:49.278847 containerd[1480]: 2025-03-21 13:32:49.246 [INFO][4096] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-b5fxs" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0", GenerateName:"calico-apiserver-7d8f557dbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"c18e3485-1786-4c2c-91d4-a50eb1c579da", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d8f557dbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b", Pod:"calico-apiserver-7d8f557dbd-b5fxs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.69/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid85965a55a6", MAC:"26:cb:23:c7:7e:dc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 21 13:32:49.278847 containerd[1480]: 2025-03-21 13:32:49.273 [INFO][4096] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" Namespace="calico-apiserver" Pod="calico-apiserver-7d8f557dbd-b5fxs" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-calico--apiserver--7d8f557dbd--b5fxs-eth0" Mar 21 13:32:49.284812 systemd[1]: Started cri-containerd-49babdf5ea7ebdac7d33dc87a0ee56439d30bcb04975223f02da966f9559adc8.scope - libcontainer container 49babdf5ea7ebdac7d33dc87a0ee56439d30bcb04975223f02da966f9559adc8. Mar 21 13:32:49.302020 containerd[1480]: time="2025-03-21T13:32:49.300765622Z" level=info msg="connecting to shim 029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235" address="unix:///run/containerd/s/b5f93f31430803e555ed30d543bf47bbe76d93d829397218f40b5a47ee89a591" namespace=k8s.io protocol=ttrpc version=3 Mar 21 13:32:49.344646 systemd[1]: Started cri-containerd-029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235.scope - libcontainer container 029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235. 
Mar 21 13:32:49.373675 containerd[1480]: time="2025-03-21T13:32:49.373341117Z" level=info msg="StartContainer for \"49babdf5ea7ebdac7d33dc87a0ee56439d30bcb04975223f02da966f9559adc8\" returns successfully" Mar 21 13:32:49.422667 containerd[1480]: time="2025-03-21T13:32:49.418310920Z" level=info msg="connecting to shim 7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b" address="unix:///run/containerd/s/7278634502201e663c46cfb14959383e1de05e1f98fcd4095c2d075865d42de4" namespace=k8s.io protocol=ttrpc version=3 Mar 21 13:32:49.466854 systemd[1]: Started cri-containerd-7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b.scope - libcontainer container 7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b. Mar 21 13:32:49.496885 kubelet[2781]: I0321 13:32:49.496735 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-pg92t" podStartSLOduration=39.496715247 podStartE2EDuration="39.496715247s" podCreationTimestamp="2025-03-21 13:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 13:32:49.489142624 +0000 UTC m=+53.415361055" watchObservedRunningTime="2025-03-21 13:32:49.496715247 +0000 UTC m=+53.422933628" Mar 21 13:32:49.497861 containerd[1480]: time="2025-03-21T13:32:49.497827815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c6dcfd6c4-2svmz,Uid:ac20d49b-f5e7-4afb-8b4c-03a33917614e,Namespace:calico-system,Attempt:0,} returns sandbox id \"029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235\"" Mar 21 13:32:49.552769 containerd[1480]: time="2025-03-21T13:32:49.552699657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d8f557dbd-b5fxs,Uid:c18e3485-1786-4c2c-91d4-a50eb1c579da,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b\"" Mar 
21 13:32:50.066942 systemd-networkd[1380]: calibe4e569c5f6: Gained IPv6LL Mar 21 13:32:50.195077 systemd-networkd[1380]: cali6f92f3ea1f3: Gained IPv6LL Mar 21 13:32:50.323123 systemd-networkd[1380]: calid85965a55a6: Gained IPv6LL Mar 21 13:32:50.359374 containerd[1480]: time="2025-03-21T13:32:50.359329408Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:32:50.360312 containerd[1480]: time="2025-03-21T13:32:50.360252571Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887" Mar 21 13:32:50.361719 containerd[1480]: time="2025-03-21T13:32:50.361671246Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:32:50.364140 containerd[1480]: time="2025-03-21T13:32:50.364076014Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 21 13:32:50.364877 containerd[1480]: time="2025-03-21T13:32:50.364840733Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 2.725720051s" Mar 21 13:32:50.365029 containerd[1480]: time="2025-03-21T13:32:50.364970673Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\"" Mar 21 13:32:50.366385 containerd[1480]: time="2025-03-21T13:32:50.366360702Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 21 13:32:50.368715 containerd[1480]: time="2025-03-21T13:32:50.368548574Z" level=info msg="CreateContainer within sandbox \"6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 21 13:32:50.385489 containerd[1480]: time="2025-03-21T13:32:50.384581596Z" level=info msg="Container 9612c2493a0ce2c246fac295f19a8a83791fc1a27cccf09321eaf238887c3eef: CDI devices from CRI Config.CDIDevices: []" Mar 21 13:32:50.398404 containerd[1480]: time="2025-03-21T13:32:50.398370047Z" level=info msg="CreateContainer within sandbox \"6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"9612c2493a0ce2c246fac295f19a8a83791fc1a27cccf09321eaf238887c3eef\"" Mar 21 13:32:50.399291 containerd[1480]: time="2025-03-21T13:32:50.399251220Z" level=info msg="StartContainer for \"9612c2493a0ce2c246fac295f19a8a83791fc1a27cccf09321eaf238887c3eef\"" Mar 21 13:32:50.401462 containerd[1480]: time="2025-03-21T13:32:50.401381170Z" level=info msg="connecting to shim 9612c2493a0ce2c246fac295f19a8a83791fc1a27cccf09321eaf238887c3eef" address="unix:///run/containerd/s/75e3e23cbda5f4319172c4131f3134357252d6101f681f373adf822f8c60bab4" protocol=ttrpc version=3 Mar 21 13:32:50.432610 systemd[1]: Started cri-containerd-9612c2493a0ce2c246fac295f19a8a83791fc1a27cccf09321eaf238887c3eef.scope - libcontainer container 9612c2493a0ce2c246fac295f19a8a83791fc1a27cccf09321eaf238887c3eef. 
Mar 21 13:32:50.585499 containerd[1480]: time="2025-03-21T13:32:50.585010080Z" level=info msg="StartContainer for \"9612c2493a0ce2c246fac295f19a8a83791fc1a27cccf09321eaf238887c3eef\" returns successfully"
Mar 21 13:32:50.899323 systemd-networkd[1380]: calid7c4f8fa998: Gained IPv6LL
Mar 21 13:32:51.191647 containerd[1480]: time="2025-03-21T13:32:51.190369126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-nf9ts,Uid:bad9e428-aaad-4096-9b87-4071585b40fd,Namespace:kube-system,Attempt:0,}"
Mar 21 13:32:51.367998 systemd-networkd[1380]: califea2c5d42b6: Link UP
Mar 21 13:32:51.368527 systemd-networkd[1380]: califea2c5d42b6: Gained carrier
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.291 [INFO][4420] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0 coredns-7db6d8ff4d- kube-system bad9e428-aaad-4096-9b87-4071585b40fd 733 0 2025-03-21 13:32:10 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-9999-0-3-4-20a459a426.novalocal coredns-7db6d8ff4d-nf9ts eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califea2c5d42b6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nf9ts" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.291 [INFO][4420] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nf9ts" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.324 [INFO][4432] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" HandleID="k8s-pod-network.c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.335 [INFO][4432] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" HandleID="k8s-pod-network.c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000290b20), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-9999-0-3-4-20a459a426.novalocal", "pod":"coredns-7db6d8ff4d-nf9ts", "timestamp":"2025-03-21 13:32:51.324199203 +0000 UTC"}, Hostname:"ci-9999-0-3-4-20a459a426.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.335 [INFO][4432] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.335 [INFO][4432] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.335 [INFO][4432] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-9999-0-3-4-20a459a426.novalocal'
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.337 [INFO][4432] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" host="ci-9999-0-3-4-20a459a426.novalocal"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.341 [INFO][4432] ipam/ipam.go 372: Looking up existing affinities for host host="ci-9999-0-3-4-20a459a426.novalocal"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.346 [INFO][4432] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.347 [INFO][4432] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.350 [INFO][4432] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-9999-0-3-4-20a459a426.novalocal"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.350 [INFO][4432] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" host="ci-9999-0-3-4-20a459a426.novalocal"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.351 [INFO][4432] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.356 [INFO][4432] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" host="ci-9999-0-3-4-20a459a426.novalocal"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.363 [INFO][4432] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.70/26] block=192.168.91.64/26 handle="k8s-pod-network.c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" host="ci-9999-0-3-4-20a459a426.novalocal"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.364 [INFO][4432] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.70/26] handle="k8s-pod-network.c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" host="ci-9999-0-3-4-20a459a426.novalocal"
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.364 [INFO][4432] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 21 13:32:51.389933 containerd[1480]: 2025-03-21 13:32:51.364 [INFO][4432] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.70/26] IPv6=[] ContainerID="c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" HandleID="k8s-pod-network.c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" Workload="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0"
Mar 21 13:32:51.393535 containerd[1480]: 2025-03-21 13:32:51.365 [INFO][4420] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nf9ts" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"bad9e428-aaad-4096-9b87-4071585b40fd", ResourceVersion:"733", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 10, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-nf9ts", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califea2c5d42b6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 21 13:32:51.393535 containerd[1480]: 2025-03-21 13:32:51.365 [INFO][4420] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.70/32] ContainerID="c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nf9ts" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0"
Mar 21 13:32:51.393535 containerd[1480]: 2025-03-21 13:32:51.365 [INFO][4420] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califea2c5d42b6 ContainerID="c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nf9ts" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0"
Mar 21 13:32:51.393535 containerd[1480]: 2025-03-21 13:32:51.369 [INFO][4420] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nf9ts" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0"
Mar 21 13:32:51.393535 containerd[1480]: 2025-03-21 13:32:51.369 [INFO][4420] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nf9ts" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"bad9e428-aaad-4096-9b87-4071585b40fd", ResourceVersion:"733", Generation:0, CreationTimestamp:time.Date(2025, time.March, 21, 13, 32, 10, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-9999-0-3-4-20a459a426.novalocal", ContainerID:"c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1", Pod:"coredns-7db6d8ff4d-nf9ts", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califea2c5d42b6", MAC:"12:f3:f3:de:d9:7c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 21 13:32:51.393535 containerd[1480]: 2025-03-21 13:32:51.385 [INFO][4420] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nf9ts" WorkloadEndpoint="ci--9999--0--3--4--20a459a426.novalocal-k8s-coredns--7db6d8ff4d--nf9ts-eth0"
Mar 21 13:32:51.429981 containerd[1480]: time="2025-03-21T13:32:51.428612940Z" level=info msg="connecting to shim c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1" address="unix:///run/containerd/s/561f187f7bbbef861866ba3887827da888b334bee7ae8f3342b58bcbe63ff3d0" namespace=k8s.io protocol=ttrpc version=3
Mar 21 13:32:51.458863 systemd[1]: Started cri-containerd-c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1.scope - libcontainer container c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1.
Mar 21 13:32:51.635878 containerd[1480]: time="2025-03-21T13:32:51.635781400Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-nf9ts,Uid:bad9e428-aaad-4096-9b87-4071585b40fd,Namespace:kube-system,Attempt:0,} returns sandbox id \"c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1\""
Mar 21 13:32:51.681795 containerd[1480]: time="2025-03-21T13:32:51.681533233Z" level=info msg="CreateContainer within sandbox \"c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Mar 21 13:32:51.854547 containerd[1480]: time="2025-03-21T13:32:51.852560324Z" level=info msg="Container 742396fc17cc631084480aa87fcf5f3de4375d23ed8eb6ebf989fc112fef794e: CDI devices from CRI Config.CDIDevices: []"
Mar 21 13:32:51.882742 containerd[1480]: time="2025-03-21T13:32:51.882658343Z" level=info msg="CreateContainer within sandbox \"c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"742396fc17cc631084480aa87fcf5f3de4375d23ed8eb6ebf989fc112fef794e\""
Mar 21 13:32:51.885509 containerd[1480]: time="2025-03-21T13:32:51.883521059Z" level=info msg="StartContainer for \"742396fc17cc631084480aa87fcf5f3de4375d23ed8eb6ebf989fc112fef794e\""
Mar 21 13:32:51.888878 containerd[1480]: time="2025-03-21T13:32:51.888818135Z" level=info msg="connecting to shim 742396fc17cc631084480aa87fcf5f3de4375d23ed8eb6ebf989fc112fef794e" address="unix:///run/containerd/s/561f187f7bbbef861866ba3887827da888b334bee7ae8f3342b58bcbe63ff3d0" protocol=ttrpc version=3
Mar 21 13:32:51.919595 systemd[1]: Started cri-containerd-742396fc17cc631084480aa87fcf5f3de4375d23ed8eb6ebf989fc112fef794e.scope - libcontainer container 742396fc17cc631084480aa87fcf5f3de4375d23ed8eb6ebf989fc112fef794e.
Mar 21 13:32:51.962711 containerd[1480]: time="2025-03-21T13:32:51.962665735Z" level=info msg="StartContainer for \"742396fc17cc631084480aa87fcf5f3de4375d23ed8eb6ebf989fc112fef794e\" returns successfully"
Mar 21 13:32:52.520809 kubelet[2781]: I0321 13:32:52.520268 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-nf9ts" podStartSLOduration=42.520093039 podStartE2EDuration="42.520093039s" podCreationTimestamp="2025-03-21 13:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-21 13:32:52.519833411 +0000 UTC m=+56.446051842" watchObservedRunningTime="2025-03-21 13:32:52.520093039 +0000 UTC m=+56.446311430"
Mar 21 13:32:53.074592 systemd-networkd[1380]: califea2c5d42b6: Gained IPv6LL
Mar 21 13:32:57.101340 containerd[1480]: time="2025-03-21T13:32:57.101212357Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:57.102661 containerd[1480]: time="2025-03-21T13:32:57.102622034Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204"
Mar 21 13:32:57.103968 containerd[1480]: time="2025-03-21T13:32:57.103919947Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:57.106293 containerd[1480]: time="2025-03-21T13:32:57.106233061Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:32:57.106982 containerd[1480]: time="2025-03-21T13:32:57.106945034Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 6.740550246s"
Mar 21 13:32:57.107062 containerd[1480]: time="2025-03-21T13:32:57.106982625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 21 13:32:57.109639 containerd[1480]: time="2025-03-21T13:32:57.108756529Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\""
Mar 21 13:32:57.110950 containerd[1480]: time="2025-03-21T13:32:57.110916381Z" level=info msg="CreateContainer within sandbox \"96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 21 13:32:57.124614 containerd[1480]: time="2025-03-21T13:32:57.124577623Z" level=info msg="Container 27ab9c9fc5bfadebb82fcc6c96a1e60f17cea6b086a51661b42341011f836f29: CDI devices from CRI Config.CDIDevices: []"
Mar 21 13:32:57.136034 containerd[1480]: time="2025-03-21T13:32:57.135933746Z" level=info msg="CreateContainer within sandbox \"96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"27ab9c9fc5bfadebb82fcc6c96a1e60f17cea6b086a51661b42341011f836f29\""
Mar 21 13:32:57.138003 containerd[1480]: time="2025-03-21T13:32:57.136623956Z" level=info msg="StartContainer for \"27ab9c9fc5bfadebb82fcc6c96a1e60f17cea6b086a51661b42341011f836f29\""
Mar 21 13:32:57.138003 containerd[1480]: time="2025-03-21T13:32:57.137742857Z" level=info msg="connecting to shim 27ab9c9fc5bfadebb82fcc6c96a1e60f17cea6b086a51661b42341011f836f29" address="unix:///run/containerd/s/5a9db7f1094eef4ba880999cc3a254fd4c221c22d1aa27f25eafc97d227b585e" protocol=ttrpc version=3
Mar 21 13:32:57.162714 systemd[1]: Started cri-containerd-27ab9c9fc5bfadebb82fcc6c96a1e60f17cea6b086a51661b42341011f836f29.scope - libcontainer container 27ab9c9fc5bfadebb82fcc6c96a1e60f17cea6b086a51661b42341011f836f29.
Mar 21 13:32:57.233147 containerd[1480]: time="2025-03-21T13:32:57.232548770Z" level=info msg="StartContainer for \"27ab9c9fc5bfadebb82fcc6c96a1e60f17cea6b086a51661b42341011f836f29\" returns successfully"
Mar 21 13:32:59.150689 kubelet[2781]: I0321 13:32:59.150473 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d8f557dbd-4qzd9" podStartSLOduration=33.310461472 podStartE2EDuration="41.150452238s" podCreationTimestamp="2025-03-21 13:32:18 +0000 UTC" firstStartedPulling="2025-03-21 13:32:49.268536805 +0000 UTC m=+53.194755196" lastFinishedPulling="2025-03-21 13:32:57.108527561 +0000 UTC m=+61.034745962" observedRunningTime="2025-03-21 13:32:57.500387647 +0000 UTC m=+61.426606028" watchObservedRunningTime="2025-03-21 13:32:59.150452238 +0000 UTC m=+63.076670619"
Mar 21 13:33:02.299728 containerd[1480]: time="2025-03-21T13:33:02.299674870Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:33:02.300959 containerd[1480]: time="2025-03-21T13:33:02.300893336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912"
Mar 21 13:33:02.302187 containerd[1480]: time="2025-03-21T13:33:02.302131869Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:33:02.304844 containerd[1480]: time="2025-03-21T13:33:02.304794431Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:33:02.305823 containerd[1480]: time="2025-03-21T13:33:02.305344290Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 5.196557764s"
Mar 21 13:33:02.305823 containerd[1480]: time="2025-03-21T13:33:02.305417790Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\""
Mar 21 13:33:02.308195 containerd[1480]: time="2025-03-21T13:33:02.307974229Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 21 13:33:02.324077 containerd[1480]: time="2025-03-21T13:33:02.323190628Z" level=info msg="CreateContainer within sandbox \"029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 21 13:33:02.337704 containerd[1480]: time="2025-03-21T13:33:02.337602170Z" level=info msg="Container 2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b: CDI devices from CRI Config.CDIDevices: []"
Mar 21 13:33:02.351403 containerd[1480]: time="2025-03-21T13:33:02.351356608Z" level=info msg="CreateContainer within sandbox \"029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\""
Mar 21 13:33:02.352709 containerd[1480]: time="2025-03-21T13:33:02.352586145Z" level=info msg="StartContainer for \"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\""
Mar 21 13:33:02.354262 containerd[1480]: time="2025-03-21T13:33:02.354221015Z" level=info msg="connecting to shim 2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b" address="unix:///run/containerd/s/b5f93f31430803e555ed30d543bf47bbe76d93d829397218f40b5a47ee89a591" protocol=ttrpc version=3
Mar 21 13:33:02.378598 systemd[1]: Started cri-containerd-2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b.scope - libcontainer container 2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b.
Mar 21 13:33:02.432494 containerd[1480]: time="2025-03-21T13:33:02.431640180Z" level=info msg="StartContainer for \"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" returns successfully"
Mar 21 13:33:02.555060 kubelet[2781]: I0321 13:33:02.554831 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5c6dcfd6c4-2svmz" podStartSLOduration=30.74806633 podStartE2EDuration="43.554814432s" podCreationTimestamp="2025-03-21 13:32:19 +0000 UTC" firstStartedPulling="2025-03-21 13:32:49.499814673 +0000 UTC m=+53.426033054" lastFinishedPulling="2025-03-21 13:33:02.306562775 +0000 UTC m=+66.232781156" observedRunningTime="2025-03-21 13:33:02.552253765 +0000 UTC m=+66.478472146" watchObservedRunningTime="2025-03-21 13:33:02.554814432 +0000 UTC m=+66.481032813"
Mar 21 13:33:02.678683 containerd[1480]: time="2025-03-21T13:33:02.678565535Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"08e8fc661ba8fa00a81f7f8d0d0a0746b6f9b7dfbc8a1a0fbe1aba9ec407f439\" pid:4644 exited_at:{seconds:1742563982 nanos:677577189}"
Mar 21 13:33:02.778041 containerd[1480]: time="2025-03-21T13:33:02.777654213Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:33:02.780336 containerd[1480]: time="2025-03-21T13:33:02.780284904Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77"
Mar 21 13:33:02.782838 containerd[1480]: time="2025-03-21T13:33:02.782755478Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 474.753367ms"
Mar 21 13:33:02.782838 containerd[1480]: time="2025-03-21T13:33:02.782802598Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 21 13:33:02.785491 containerd[1480]: time="2025-03-21T13:33:02.784983300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\""
Mar 21 13:33:02.787168 containerd[1480]: time="2025-03-21T13:33:02.786986012Z" level=info msg="CreateContainer within sandbox \"7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 21 13:33:02.801638 containerd[1480]: time="2025-03-21T13:33:02.801601624Z" level=info msg="Container 2119f90383043a4828d76138afb22a1f821857ce01a1ef1d0b89efaf899edbfd: CDI devices from CRI Config.CDIDevices: []"
Mar 21 13:33:02.817317 containerd[1480]: time="2025-03-21T13:33:02.817117865Z" level=info msg="CreateContainer within sandbox \"7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2119f90383043a4828d76138afb22a1f821857ce01a1ef1d0b89efaf899edbfd\""
Mar 21 13:33:02.818378 containerd[1480]: time="2025-03-21T13:33:02.818167899Z" level=info msg="StartContainer for \"2119f90383043a4828d76138afb22a1f821857ce01a1ef1d0b89efaf899edbfd\""
Mar 21 13:33:02.819593 containerd[1480]: time="2025-03-21T13:33:02.819409618Z" level=info msg="connecting to shim 2119f90383043a4828d76138afb22a1f821857ce01a1ef1d0b89efaf899edbfd" address="unix:///run/containerd/s/7278634502201e663c46cfb14959383e1de05e1f98fcd4095c2d075865d42de4" protocol=ttrpc version=3
Mar 21 13:33:02.849850 systemd[1]: Started cri-containerd-2119f90383043a4828d76138afb22a1f821857ce01a1ef1d0b89efaf899edbfd.scope - libcontainer container 2119f90383043a4828d76138afb22a1f821857ce01a1ef1d0b89efaf899edbfd.
Mar 21 13:33:03.015162 containerd[1480]: time="2025-03-21T13:33:03.014993434Z" level=info msg="StartContainer for \"2119f90383043a4828d76138afb22a1f821857ce01a1ef1d0b89efaf899edbfd\" returns successfully"
Mar 21 13:33:04.524933 kubelet[2781]: I0321 13:33:04.524705 2781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 21 13:33:05.285236 containerd[1480]: time="2025-03-21T13:33:05.285182227Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"d05646987dd240064200d0f21c75400f907fa2d6c77d56703f050a1ae51dac48\" pid:4704 exited_at:{seconds:1742563985 nanos:283697446}"
Mar 21 13:33:05.352597 containerd[1480]: time="2025-03-21T13:33:05.352556951Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:33:05.353596 containerd[1480]: time="2025-03-21T13:33:05.353556968Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843"
Mar 21 13:33:05.354633 containerd[1480]: time="2025-03-21T13:33:05.354585708Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:33:05.357163 containerd[1480]: time="2025-03-21T13:33:05.357141089Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 21 13:33:05.358099 containerd[1480]: time="2025-03-21T13:33:05.357785728Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.572633125s"
Mar 21 13:33:05.358099 containerd[1480]: time="2025-03-21T13:33:05.357825894Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\""
Mar 21 13:33:05.360403 containerd[1480]: time="2025-03-21T13:33:05.360360245Z" level=info msg="CreateContainer within sandbox \"6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 21 13:33:05.385465 containerd[1480]: time="2025-03-21T13:33:05.382095743Z" level=info msg="Container 3c6d4dd547dec9782e1e905dc501dd2ab9992e0c1f0cf6191837ef09a86b3a3a: CDI devices from CRI Config.CDIDevices: []"
Mar 21 13:33:05.401050 containerd[1480]: time="2025-03-21T13:33:05.400985607Z" level=info msg="CreateContainer within sandbox \"6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3c6d4dd547dec9782e1e905dc501dd2ab9992e0c1f0cf6191837ef09a86b3a3a\""
Mar 21 13:33:05.403501 containerd[1480]: time="2025-03-21T13:33:05.401690440Z" level=info msg="StartContainer for \"3c6d4dd547dec9782e1e905dc501dd2ab9992e0c1f0cf6191837ef09a86b3a3a\""
Mar 21 13:33:05.403635 containerd[1480]: time="2025-03-21T13:33:05.403531899Z" level=info msg="connecting to shim 3c6d4dd547dec9782e1e905dc501dd2ab9992e0c1f0cf6191837ef09a86b3a3a" address="unix:///run/containerd/s/75e3e23cbda5f4319172c4131f3134357252d6101f681f373adf822f8c60bab4" protocol=ttrpc version=3
Mar 21 13:33:05.434631 systemd[1]: Started cri-containerd-3c6d4dd547dec9782e1e905dc501dd2ab9992e0c1f0cf6191837ef09a86b3a3a.scope - libcontainer container 3c6d4dd547dec9782e1e905dc501dd2ab9992e0c1f0cf6191837ef09a86b3a3a.
Mar 21 13:33:05.492395 containerd[1480]: time="2025-03-21T13:33:05.492337913Z" level=info msg="StartContainer for \"3c6d4dd547dec9782e1e905dc501dd2ab9992e0c1f0cf6191837ef09a86b3a3a\" returns successfully"
Mar 21 13:33:05.550379 kubelet[2781]: I0321 13:33:05.550047 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-hh6d2" podStartSLOduration=28.830082449 podStartE2EDuration="46.550030834s" podCreationTimestamp="2025-03-21 13:32:19 +0000 UTC" firstStartedPulling="2025-03-21 13:32:47.638892914 +0000 UTC m=+51.565111305" lastFinishedPulling="2025-03-21 13:33:05.358841289 +0000 UTC m=+69.285059690" observedRunningTime="2025-03-21 13:33:05.548855083 +0000 UTC m=+69.475073474" watchObservedRunningTime="2025-03-21 13:33:05.550030834 +0000 UTC m=+69.476249225"
Mar 21 13:33:05.551420 kubelet[2781]: I0321 13:33:05.550407 2781 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d8f557dbd-b5fxs" podStartSLOduration=34.321138662 podStartE2EDuration="47.550399677s" podCreationTimestamp="2025-03-21 13:32:18 +0000 UTC" firstStartedPulling="2025-03-21 13:32:49.554849329 +0000 UTC m=+53.481067710" lastFinishedPulling="2025-03-21 13:33:02.784110344 +0000 UTC m=+66.710328725" observedRunningTime="2025-03-21 13:33:03.538178726 +0000 UTC m=+67.464397188" watchObservedRunningTime="2025-03-21 13:33:05.550399677 +0000 UTC m=+69.476618058"
Mar 21 13:33:06.074013 containerd[1480]: time="2025-03-21T13:33:06.073926317Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"d8d725f58a9ae49c1748b9a1479b1b98c823c5891fb896dbf6b16e6605e722c8\" pid:4758 exited_at:{seconds:1742563986 nanos:73269355}"
Mar 21 13:33:06.312561 kubelet[2781]: I0321 13:33:06.312507 2781 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 21 13:33:06.312561 kubelet[2781]: I0321 13:33:06.312576 2781 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 21 13:33:08.951514 kubelet[2781]: I0321 13:33:08.950268 2781 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 21 13:33:35.328788 containerd[1480]: time="2025-03-21T13:33:35.328431843Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"3dc77ce813dc7f4094beeb4242c378223a5783cc06c18c1cf2e7e2bc1c41caf6\" pid:4810 exited_at:{seconds:1742564015 nanos:327866995}"
Mar 21 13:33:36.066402 containerd[1480]: time="2025-03-21T13:33:36.066154983Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"22f35333008ceb98e532db636692bf3f0c2ed0cd3043be43c2a6d73dfbe6dd2d\" pid:4831 exited_at:{seconds:1742564016 nanos:65518519}"
Mar 21 13:33:53.031537 systemd[1]: Started sshd@7-172.24.4.107:22-172.24.4.1:46762.service - OpenSSH per-connection server daemon (172.24.4.1:46762).
Mar 21 13:33:54.262984 sshd[4850]: Accepted publickey for core from 172.24.4.1 port 46762 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:33:54.266166 sshd-session[4850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:33:54.285558 systemd-logind[1460]: New session 10 of user core. Mar 21 13:33:54.290850 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 21 13:33:55.088977 sshd[4852]: Connection closed by 172.24.4.1 port 46762 Mar 21 13:33:55.090098 sshd-session[4850]: pam_unix(sshd:session): session closed for user core Mar 21 13:33:55.097650 systemd-logind[1460]: Session 10 logged out. Waiting for processes to exit. Mar 21 13:33:55.099269 systemd[1]: sshd@7-172.24.4.107:22-172.24.4.1:46762.service: Deactivated successfully. Mar 21 13:33:55.105398 systemd[1]: session-10.scope: Deactivated successfully. Mar 21 13:33:55.108561 systemd-logind[1460]: Removed session 10. Mar 21 13:34:00.113659 systemd[1]: Started sshd@8-172.24.4.107:22-172.24.4.1:39488.service - OpenSSH per-connection server daemon (172.24.4.1:39488). Mar 21 13:34:01.457110 sshd[4867]: Accepted publickey for core from 172.24.4.1 port 39488 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:34:01.460113 sshd-session[4867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:34:01.471428 systemd-logind[1460]: New session 11 of user core. Mar 21 13:34:01.479764 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 21 13:34:02.229577 sshd[4869]: Connection closed by 172.24.4.1 port 39488 Mar 21 13:34:02.230336 sshd-session[4867]: pam_unix(sshd:session): session closed for user core Mar 21 13:34:02.234565 systemd-logind[1460]: Session 11 logged out. Waiting for processes to exit. Mar 21 13:34:02.234655 systemd[1]: sshd@8-172.24.4.107:22-172.24.4.1:39488.service: Deactivated successfully. 
Mar 21 13:34:02.238319 systemd[1]: session-11.scope: Deactivated successfully. Mar 21 13:34:02.243537 systemd-logind[1460]: Removed session 11. Mar 21 13:34:05.312909 containerd[1480]: time="2025-03-21T13:34:05.312851999Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"50a2c0aa051fe5473090d3eb5e9756a590556daf40859561f5c635fb90df83b9\" pid:4893 exited_at:{seconds:1742564045 nanos:312306110}" Mar 21 13:34:06.071036 containerd[1480]: time="2025-03-21T13:34:06.070981354Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"043c59043bf3113896319fd6906effcfa9bdbba4c1af734f384b5ee085431777\" pid:4915 exited_at:{seconds:1742564046 nanos:70577202}" Mar 21 13:34:07.250898 systemd[1]: Started sshd@9-172.24.4.107:22-172.24.4.1:54760.service - OpenSSH per-connection server daemon (172.24.4.1:54760). Mar 21 13:34:07.854412 containerd[1480]: time="2025-03-21T13:34:07.854344666Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"e2d4e697d406bb13cde54333b6e5f4a6fe38061ec7712a7b287114584bf81529\" pid:4949 exited_at:{seconds:1742564047 nanos:854104183}" Mar 21 13:34:08.500458 sshd[4934]: Accepted publickey for core from 172.24.4.1 port 54760 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:34:08.502815 sshd-session[4934]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:34:08.512415 systemd-logind[1460]: New session 12 of user core. Mar 21 13:34:08.519853 systemd[1]: Started session-12.scope - Session 12 of User core. 
Mar 21 13:34:09.229122 sshd[4958]: Connection closed by 172.24.4.1 port 54760 Mar 21 13:34:09.229788 sshd-session[4934]: pam_unix(sshd:session): session closed for user core Mar 21 13:34:09.244732 systemd[1]: sshd@9-172.24.4.107:22-172.24.4.1:54760.service: Deactivated successfully. Mar 21 13:34:09.251923 systemd[1]: session-12.scope: Deactivated successfully. Mar 21 13:34:09.254201 systemd-logind[1460]: Session 12 logged out. Waiting for processes to exit. Mar 21 13:34:09.259073 systemd[1]: Started sshd@10-172.24.4.107:22-172.24.4.1:54776.service - OpenSSH per-connection server daemon (172.24.4.1:54776). Mar 21 13:34:09.260327 systemd-logind[1460]: Removed session 12. Mar 21 13:34:10.693392 sshd[4970]: Accepted publickey for core from 172.24.4.1 port 54776 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:34:10.696776 sshd-session[4970]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:34:10.711509 systemd-logind[1460]: New session 13 of user core. Mar 21 13:34:10.722741 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 21 13:34:11.654360 sshd[4973]: Connection closed by 172.24.4.1 port 54776 Mar 21 13:34:11.654186 sshd-session[4970]: pam_unix(sshd:session): session closed for user core Mar 21 13:34:11.672866 systemd[1]: sshd@10-172.24.4.107:22-172.24.4.1:54776.service: Deactivated successfully. Mar 21 13:34:11.676975 systemd[1]: session-13.scope: Deactivated successfully. Mar 21 13:34:11.679233 systemd-logind[1460]: Session 13 logged out. Waiting for processes to exit. Mar 21 13:34:11.684335 systemd[1]: Started sshd@11-172.24.4.107:22-172.24.4.1:54790.service - OpenSSH per-connection server daemon (172.24.4.1:54790). Mar 21 13:34:11.689199 systemd-logind[1460]: Removed session 13. 
Mar 21 13:34:13.020309 sshd[4987]: Accepted publickey for core from 172.24.4.1 port 54790 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:34:13.023085 sshd-session[4987]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:34:13.035247 systemd-logind[1460]: New session 14 of user core. Mar 21 13:34:13.046752 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 21 13:34:13.772258 sshd[4990]: Connection closed by 172.24.4.1 port 54790 Mar 21 13:34:13.773419 sshd-session[4987]: pam_unix(sshd:session): session closed for user core Mar 21 13:34:13.779738 systemd[1]: sshd@11-172.24.4.107:22-172.24.4.1:54790.service: Deactivated successfully. Mar 21 13:34:13.784015 systemd[1]: session-14.scope: Deactivated successfully. Mar 21 13:34:13.785359 systemd-logind[1460]: Session 14 logged out. Waiting for processes to exit. Mar 21 13:34:13.786684 systemd-logind[1460]: Removed session 14. Mar 21 13:34:18.797075 systemd[1]: Started sshd@12-172.24.4.107:22-172.24.4.1:46842.service - OpenSSH per-connection server daemon (172.24.4.1:46842). Mar 21 13:34:20.061424 sshd[5018]: Accepted publickey for core from 172.24.4.1 port 46842 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:34:20.064261 sshd-session[5018]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:34:20.079816 systemd-logind[1460]: New session 15 of user core. Mar 21 13:34:20.089854 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 21 13:34:20.927189 sshd[5020]: Connection closed by 172.24.4.1 port 46842 Mar 21 13:34:20.928513 sshd-session[5018]: pam_unix(sshd:session): session closed for user core Mar 21 13:34:20.936719 systemd[1]: sshd@12-172.24.4.107:22-172.24.4.1:46842.service: Deactivated successfully. Mar 21 13:34:20.941761 systemd[1]: session-15.scope: Deactivated successfully. Mar 21 13:34:20.943807 systemd-logind[1460]: Session 15 logged out. 
Waiting for processes to exit. Mar 21 13:34:20.946608 systemd-logind[1460]: Removed session 15. Mar 21 13:34:25.950313 systemd[1]: Started sshd@13-172.24.4.107:22-172.24.4.1:51136.service - OpenSSH per-connection server daemon (172.24.4.1:51136). Mar 21 13:34:27.256008 sshd[5036]: Accepted publickey for core from 172.24.4.1 port 51136 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:34:27.258846 sshd-session[5036]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:34:27.271597 systemd-logind[1460]: New session 16 of user core. Mar 21 13:34:27.278750 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 21 13:34:28.023594 sshd[5038]: Connection closed by 172.24.4.1 port 51136 Mar 21 13:34:28.023897 sshd-session[5036]: pam_unix(sshd:session): session closed for user core Mar 21 13:34:28.029568 systemd-logind[1460]: Session 16 logged out. Waiting for processes to exit. Mar 21 13:34:28.030250 systemd[1]: sshd@13-172.24.4.107:22-172.24.4.1:51136.service: Deactivated successfully. Mar 21 13:34:28.033076 systemd[1]: session-16.scope: Deactivated successfully. Mar 21 13:34:28.034989 systemd-logind[1460]: Removed session 16. Mar 21 13:34:33.043989 systemd[1]: Started sshd@14-172.24.4.107:22-172.24.4.1:51140.service - OpenSSH per-connection server daemon (172.24.4.1:51140). Mar 21 13:34:34.254084 sshd[5050]: Accepted publickey for core from 172.24.4.1 port 51140 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:34:34.256920 sshd-session[5050]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:34:34.267894 systemd-logind[1460]: New session 17 of user core. Mar 21 13:34:34.279746 systemd[1]: Started session-17.scope - Session 17 of User core. 
Mar 21 13:34:35.023673 sshd[5052]: Connection closed by 172.24.4.1 port 51140 Mar 21 13:34:35.023818 sshd-session[5050]: pam_unix(sshd:session): session closed for user core Mar 21 13:34:35.031795 systemd[1]: sshd@14-172.24.4.107:22-172.24.4.1:51140.service: Deactivated successfully. Mar 21 13:34:35.035410 systemd[1]: session-17.scope: Deactivated successfully. Mar 21 13:34:35.039185 systemd-logind[1460]: Session 17 logged out. Waiting for processes to exit. Mar 21 13:34:35.040540 systemd-logind[1460]: Removed session 17. Mar 21 13:34:35.274757 containerd[1480]: time="2025-03-21T13:34:35.274550910Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"163f3808b04b6496ce4861146a027e06be4f3c5ca29e8bfa031201096e077727\" pid:5075 exited_at:{seconds:1742564075 nanos:272743275}" Mar 21 13:34:36.045660 containerd[1480]: time="2025-03-21T13:34:36.045556326Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"2df511aa0cae7dd1ed394d27078d92fead69c3b401527bb46e42a00733b2a3ce\" pid:5098 exited_at:{seconds:1742564076 nanos:45219842}" Mar 21 13:34:40.048386 systemd[1]: Started sshd@15-172.24.4.107:22-172.24.4.1:39078.service - OpenSSH per-connection server daemon (172.24.4.1:39078). Mar 21 13:34:41.254566 sshd[5111]: Accepted publickey for core from 172.24.4.1 port 39078 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:34:41.257565 sshd-session[5111]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:34:41.269885 systemd-logind[1460]: New session 18 of user core. Mar 21 13:34:41.279761 systemd[1]: Started session-18.scope - Session 18 of User core. 
Mar 21 13:34:42.023132 sshd[5115]: Connection closed by 172.24.4.1 port 39078 Mar 21 13:34:42.024269 sshd-session[5111]: pam_unix(sshd:session): session closed for user core Mar 21 13:34:42.030153 systemd[1]: sshd@15-172.24.4.107:22-172.24.4.1:39078.service: Deactivated successfully. Mar 21 13:34:42.032778 systemd[1]: session-18.scope: Deactivated successfully. Mar 21 13:34:42.034832 systemd-logind[1460]: Session 18 logged out. Waiting for processes to exit. Mar 21 13:34:42.036318 systemd-logind[1460]: Removed session 18. Mar 21 13:34:47.049575 systemd[1]: Started sshd@16-172.24.4.107:22-172.24.4.1:57998.service - OpenSSH per-connection server daemon (172.24.4.1:57998). Mar 21 13:34:48.434491 sshd[5128]: Accepted publickey for core from 172.24.4.1 port 57998 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:34:48.437209 sshd-session[5128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:34:48.450652 systemd-logind[1460]: New session 19 of user core. Mar 21 13:34:48.453794 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 21 13:34:49.181565 sshd[5130]: Connection closed by 172.24.4.1 port 57998 Mar 21 13:34:49.182400 sshd-session[5128]: pam_unix(sshd:session): session closed for user core Mar 21 13:34:49.185477 systemd[1]: sshd@16-172.24.4.107:22-172.24.4.1:57998.service: Deactivated successfully. Mar 21 13:34:49.187298 systemd[1]: session-19.scope: Deactivated successfully. Mar 21 13:34:49.189991 systemd-logind[1460]: Session 19 logged out. Waiting for processes to exit. Mar 21 13:34:49.191290 systemd-logind[1460]: Removed session 19. Mar 21 13:34:54.210914 systemd[1]: Started sshd@17-172.24.4.107:22-172.24.4.1:56696.service - OpenSSH per-connection server daemon (172.24.4.1:56696). 
Mar 21 13:34:55.346145 sshd[5143]: Accepted publickey for core from 172.24.4.1 port 56696 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:34:55.349008 sshd-session[5143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:34:55.359791 systemd-logind[1460]: New session 20 of user core. Mar 21 13:34:55.367798 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 21 13:34:56.094565 sshd[5145]: Connection closed by 172.24.4.1 port 56696 Mar 21 13:34:56.093967 sshd-session[5143]: pam_unix(sshd:session): session closed for user core Mar 21 13:34:56.099603 systemd[1]: sshd@17-172.24.4.107:22-172.24.4.1:56696.service: Deactivated successfully. Mar 21 13:34:56.101596 systemd[1]: session-20.scope: Deactivated successfully. Mar 21 13:34:56.103872 systemd-logind[1460]: Session 20 logged out. Waiting for processes to exit. Mar 21 13:34:56.105788 systemd-logind[1460]: Removed session 20. Mar 21 13:35:01.124260 systemd[1]: Started sshd@18-172.24.4.107:22-172.24.4.1:56702.service - OpenSSH per-connection server daemon (172.24.4.1:56702). Mar 21 13:35:02.258139 sshd[5159]: Accepted publickey for core from 172.24.4.1 port 56702 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:35:02.261087 sshd-session[5159]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:35:02.272650 systemd-logind[1460]: New session 21 of user core. Mar 21 13:35:02.281740 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 21 13:35:03.023518 sshd[5161]: Connection closed by 172.24.4.1 port 56702 Mar 21 13:35:03.024063 sshd-session[5159]: pam_unix(sshd:session): session closed for user core Mar 21 13:35:03.030182 systemd[1]: sshd@18-172.24.4.107:22-172.24.4.1:56702.service: Deactivated successfully. Mar 21 13:35:03.035010 systemd[1]: session-21.scope: Deactivated successfully. Mar 21 13:35:03.037510 systemd-logind[1460]: Session 21 logged out. 
Waiting for processes to exit. Mar 21 13:35:03.038796 systemd-logind[1460]: Removed session 21. Mar 21 13:35:05.314044 containerd[1480]: time="2025-03-21T13:35:05.313997102Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"0c26d5ac8a769f221452cfef2ab2f4716c462564019bd850e8e2b6df572e9fb2\" pid:5184 exited_at:{seconds:1742564105 nanos:313489731}" Mar 21 13:35:06.057113 containerd[1480]: time="2025-03-21T13:35:06.056665755Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"b1dc9afba5a5f26814675e15774180cc741e259afde6a64651b1454cf67e07ff\" pid:5205 exited_at:{seconds:1742564106 nanos:56288344}" Mar 21 13:35:07.852117 containerd[1480]: time="2025-03-21T13:35:07.851808045Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"794dda03ee43ba38d3dcafc5a33374fd110427789a9802eae7fd26816623591f\" pid:5230 exited_at:{seconds:1742564107 nanos:850525363}" Mar 21 13:35:08.035277 systemd[1]: Started sshd@19-172.24.4.107:22-172.24.4.1:45506.service - OpenSSH per-connection server daemon (172.24.4.1:45506). Mar 21 13:35:09.398961 sshd[5240]: Accepted publickey for core from 172.24.4.1 port 45506 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:35:09.402667 sshd-session[5240]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:35:09.416181 systemd-logind[1460]: New session 22 of user core. Mar 21 13:35:09.422774 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 21 13:35:09.990633 sshd[5242]: Connection closed by 172.24.4.1 port 45506 Mar 21 13:35:09.991580 sshd-session[5240]: pam_unix(sshd:session): session closed for user core Mar 21 13:35:09.996515 systemd[1]: sshd@19-172.24.4.107:22-172.24.4.1:45506.service: Deactivated successfully. 
Mar 21 13:35:09.998534 systemd[1]: session-22.scope: Deactivated successfully. Mar 21 13:35:09.999986 systemd-logind[1460]: Session 22 logged out. Waiting for processes to exit. Mar 21 13:35:10.001419 systemd-logind[1460]: Removed session 22. Mar 21 13:35:15.012411 systemd[1]: Started sshd@20-172.24.4.107:22-172.24.4.1:40460.service - OpenSSH per-connection server daemon (172.24.4.1:40460). Mar 21 13:35:16.330119 sshd[5255]: Accepted publickey for core from 172.24.4.1 port 40460 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:35:16.332161 sshd-session[5255]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:35:16.352585 systemd-logind[1460]: New session 23 of user core. Mar 21 13:35:16.358868 systemd[1]: Started session-23.scope - Session 23 of User core. Mar 21 13:35:17.165802 sshd[5257]: Connection closed by 172.24.4.1 port 40460 Mar 21 13:35:17.166377 sshd-session[5255]: pam_unix(sshd:session): session closed for user core Mar 21 13:35:17.169408 systemd-logind[1460]: Session 23 logged out. Waiting for processes to exit. Mar 21 13:35:17.170018 systemd[1]: sshd@20-172.24.4.107:22-172.24.4.1:40460.service: Deactivated successfully. Mar 21 13:35:17.172117 systemd[1]: session-23.scope: Deactivated successfully. Mar 21 13:35:17.174976 systemd-logind[1460]: Removed session 23. Mar 21 13:35:22.187113 systemd[1]: Started sshd@21-172.24.4.107:22-172.24.4.1:40474.service - OpenSSH per-connection server daemon (172.24.4.1:40474). Mar 21 13:35:23.503640 sshd[5272]: Accepted publickey for core from 172.24.4.1 port 40474 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:35:23.506121 sshd-session[5272]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:35:23.519549 systemd-logind[1460]: New session 24 of user core. Mar 21 13:35:23.525161 systemd[1]: Started session-24.scope - Session 24 of User core. 
Mar 21 13:35:24.132800 sshd[5274]: Connection closed by 172.24.4.1 port 40474 Mar 21 13:35:24.132678 sshd-session[5272]: pam_unix(sshd:session): session closed for user core Mar 21 13:35:24.138844 systemd[1]: sshd@21-172.24.4.107:22-172.24.4.1:40474.service: Deactivated successfully. Mar 21 13:35:24.142471 systemd[1]: session-24.scope: Deactivated successfully. Mar 21 13:35:24.144066 systemd-logind[1460]: Session 24 logged out. Waiting for processes to exit. Mar 21 13:35:24.145982 systemd-logind[1460]: Removed session 24. Mar 21 13:35:29.158299 systemd[1]: Started sshd@22-172.24.4.107:22-172.24.4.1:42778.service - OpenSSH per-connection server daemon (172.24.4.1:42778). Mar 21 13:35:30.374186 sshd[5292]: Accepted publickey for core from 172.24.4.1 port 42778 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:35:30.376787 sshd-session[5292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:35:30.388679 systemd-logind[1460]: New session 25 of user core. Mar 21 13:35:30.394760 systemd[1]: Started session-25.scope - Session 25 of User core. Mar 21 13:35:31.133237 sshd[5294]: Connection closed by 172.24.4.1 port 42778 Mar 21 13:35:31.134492 sshd-session[5292]: pam_unix(sshd:session): session closed for user core Mar 21 13:35:31.141208 systemd[1]: sshd@22-172.24.4.107:22-172.24.4.1:42778.service: Deactivated successfully. Mar 21 13:35:31.145289 systemd[1]: session-25.scope: Deactivated successfully. Mar 21 13:35:31.149768 systemd-logind[1460]: Session 25 logged out. Waiting for processes to exit. Mar 21 13:35:31.152547 systemd-logind[1460]: Removed session 25. 
Mar 21 13:35:35.307686 containerd[1480]: time="2025-03-21T13:35:35.307511471Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"8bc9701e23a28bb3a6d49c1524592eaea697936b5ddf3a8c5adf67b76803f394\" pid:5320 exited_at:{seconds:1742564135 nanos:306176862}" Mar 21 13:35:36.070250 containerd[1480]: time="2025-03-21T13:35:36.070205337Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"a06dce1081056e2cb8b9a0e346b0d290959d7916e679801427a806abc40c123d\" pid:5341 exited_at:{seconds:1742564136 nanos:69897462}" Mar 21 13:35:36.147174 systemd[1]: Started sshd@23-172.24.4.107:22-172.24.4.1:37976.service - OpenSSH per-connection server daemon (172.24.4.1:37976). Mar 21 13:35:37.321085 sshd[5353]: Accepted publickey for core from 172.24.4.1 port 37976 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:35:37.324877 sshd-session[5353]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:35:37.338406 systemd-logind[1460]: New session 26 of user core. Mar 21 13:35:37.345848 systemd[1]: Started session-26.scope - Session 26 of User core. Mar 21 13:35:38.150530 sshd[5355]: Connection closed by 172.24.4.1 port 37976 Mar 21 13:35:38.151676 sshd-session[5353]: pam_unix(sshd:session): session closed for user core Mar 21 13:35:38.158226 systemd[1]: sshd@23-172.24.4.107:22-172.24.4.1:37976.service: Deactivated successfully. Mar 21 13:35:38.162991 systemd[1]: session-26.scope: Deactivated successfully. Mar 21 13:35:38.166260 systemd-logind[1460]: Session 26 logged out. Waiting for processes to exit. Mar 21 13:35:38.169034 systemd-logind[1460]: Removed session 26. Mar 21 13:35:43.170020 systemd[1]: Started sshd@24-172.24.4.107:22-172.24.4.1:37980.service - OpenSSH per-connection server daemon (172.24.4.1:37980). 
Mar 21 13:35:44.368947 sshd[5368]: Accepted publickey for core from 172.24.4.1 port 37980 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:35:44.371644 sshd-session[5368]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:35:44.382423 systemd-logind[1460]: New session 27 of user core. Mar 21 13:35:44.394749 systemd[1]: Started session-27.scope - Session 27 of User core. Mar 21 13:35:45.132177 sshd[5370]: Connection closed by 172.24.4.1 port 37980 Mar 21 13:35:45.133118 sshd-session[5368]: pam_unix(sshd:session): session closed for user core Mar 21 13:35:45.139238 systemd-logind[1460]: Session 27 logged out. Waiting for processes to exit. Mar 21 13:35:45.139767 systemd[1]: sshd@24-172.24.4.107:22-172.24.4.1:37980.service: Deactivated successfully. Mar 21 13:35:45.143227 systemd[1]: session-27.scope: Deactivated successfully. Mar 21 13:35:45.146506 systemd-logind[1460]: Removed session 27. Mar 21 13:35:50.153001 systemd[1]: Started sshd@25-172.24.4.107:22-172.24.4.1:57626.service - OpenSSH per-connection server daemon (172.24.4.1:57626). Mar 21 13:35:51.514317 sshd[5383]: Accepted publickey for core from 172.24.4.1 port 57626 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:35:51.517172 sshd-session[5383]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:35:51.527983 systemd-logind[1460]: New session 28 of user core. Mar 21 13:35:51.533765 systemd[1]: Started session-28.scope - Session 28 of User core. Mar 21 13:35:52.273386 sshd[5397]: Connection closed by 172.24.4.1 port 57626 Mar 21 13:35:52.274311 sshd-session[5383]: pam_unix(sshd:session): session closed for user core Mar 21 13:35:52.278368 systemd[1]: sshd@25-172.24.4.107:22-172.24.4.1:57626.service: Deactivated successfully. Mar 21 13:35:52.280351 systemd[1]: session-28.scope: Deactivated successfully. Mar 21 13:35:52.281408 systemd-logind[1460]: Session 28 logged out. 
Waiting for processes to exit. Mar 21 13:35:52.282543 systemd-logind[1460]: Removed session 28. Mar 21 13:35:57.293508 systemd[1]: Started sshd@26-172.24.4.107:22-172.24.4.1:49486.service - OpenSSH per-connection server daemon (172.24.4.1:49486). Mar 21 13:35:58.503285 sshd[5416]: Accepted publickey for core from 172.24.4.1 port 49486 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:35:58.505591 sshd-session[5416]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:35:58.514523 systemd-logind[1460]: New session 29 of user core. Mar 21 13:35:58.519681 systemd[1]: Started session-29.scope - Session 29 of User core. Mar 21 13:35:59.117657 sshd[5419]: Connection closed by 172.24.4.1 port 49486 Mar 21 13:35:59.118806 sshd-session[5416]: pam_unix(sshd:session): session closed for user core Mar 21 13:35:59.124068 systemd[1]: sshd@26-172.24.4.107:22-172.24.4.1:49486.service: Deactivated successfully. Mar 21 13:35:59.126524 systemd[1]: session-29.scope: Deactivated successfully. Mar 21 13:35:59.127894 systemd-logind[1460]: Session 29 logged out. Waiting for processes to exit. Mar 21 13:35:59.129703 systemd-logind[1460]: Removed session 29. Mar 21 13:36:04.143813 systemd[1]: Started sshd@27-172.24.4.107:22-172.24.4.1:55084.service - OpenSSH per-connection server daemon (172.24.4.1:55084). 
Mar 21 13:36:05.310182 containerd[1480]: time="2025-03-21T13:36:05.310141145Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"c8a0b0ee763230daef42aa9554123e4d6b808beebff6ff71f2cceaa7a26fbaf5\" pid:5447 exited_at:{seconds:1742564165 nanos:309844649}" Mar 21 13:36:05.503546 sshd[5431]: Accepted publickey for core from 172.24.4.1 port 55084 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:36:05.504617 sshd-session[5431]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:36:05.514420 systemd-logind[1460]: New session 30 of user core. Mar 21 13:36:05.523733 systemd[1]: Started session-30.scope - Session 30 of User core. Mar 21 13:36:06.069401 containerd[1480]: time="2025-03-21T13:36:06.069333583Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"d691e4713c7aad0d60306365138a5569b71dd3046768a6463a77ed363c1d1747\" pid:5478 exited_at:{seconds:1742564166 nanos:68955025}" Mar 21 13:36:06.241289 sshd[5457]: Connection closed by 172.24.4.1 port 55084 Mar 21 13:36:06.242393 sshd-session[5431]: pam_unix(sshd:session): session closed for user core Mar 21 13:36:06.249859 systemd[1]: sshd@27-172.24.4.107:22-172.24.4.1:55084.service: Deactivated successfully. Mar 21 13:36:06.254287 systemd[1]: session-30.scope: Deactivated successfully. Mar 21 13:36:06.257891 systemd-logind[1460]: Session 30 logged out. Waiting for processes to exit. Mar 21 13:36:06.260550 systemd-logind[1460]: Removed session 30. 
Mar 21 13:36:07.850753 containerd[1480]: time="2025-03-21T13:36:07.850671220Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"05c4168d6180702da618e688a549de78cd1d5a659bbc12dd94c949e217403886\" pid:5504 exited_at:{seconds:1742564167 nanos:849840697}" Mar 21 13:36:11.260090 systemd[1]: Started sshd@28-172.24.4.107:22-172.24.4.1:55096.service - OpenSSH per-connection server daemon (172.24.4.1:55096). Mar 21 13:36:12.653614 sshd[5516]: Accepted publickey for core from 172.24.4.1 port 55096 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:36:12.657480 sshd-session[5516]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:36:12.673351 systemd-logind[1460]: New session 31 of user core. Mar 21 13:36:12.678808 systemd[1]: Started session-31.scope - Session 31 of User core. Mar 21 13:36:13.529430 sshd[5518]: Connection closed by 172.24.4.1 port 55096 Mar 21 13:36:13.530645 sshd-session[5516]: pam_unix(sshd:session): session closed for user core Mar 21 13:36:13.536254 systemd[1]: sshd@28-172.24.4.107:22-172.24.4.1:55096.service: Deactivated successfully. Mar 21 13:36:13.538971 systemd[1]: session-31.scope: Deactivated successfully. Mar 21 13:36:13.540355 systemd-logind[1460]: Session 31 logged out. Waiting for processes to exit. Mar 21 13:36:13.541985 systemd-logind[1460]: Removed session 31. Mar 21 13:36:18.551521 systemd[1]: Started sshd@29-172.24.4.107:22-172.24.4.1:59260.service - OpenSSH per-connection server daemon (172.24.4.1:59260). Mar 21 13:36:19.909842 sshd[5530]: Accepted publickey for core from 172.24.4.1 port 59260 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:36:19.913384 sshd-session[5530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:36:19.927560 systemd-logind[1460]: New session 32 of user core. 
Mar 21 13:36:19.940819 systemd[1]: Started session-32.scope - Session 32 of User core. Mar 21 13:36:20.610163 sshd[5532]: Connection closed by 172.24.4.1 port 59260 Mar 21 13:36:20.610040 sshd-session[5530]: pam_unix(sshd:session): session closed for user core Mar 21 13:36:20.613007 systemd[1]: sshd@29-172.24.4.107:22-172.24.4.1:59260.service: Deactivated successfully. Mar 21 13:36:20.615924 systemd[1]: session-32.scope: Deactivated successfully. Mar 21 13:36:20.618703 systemd-logind[1460]: Session 32 logged out. Waiting for processes to exit. Mar 21 13:36:20.619853 systemd-logind[1460]: Removed session 32. Mar 21 13:36:25.637273 systemd[1]: Started sshd@30-172.24.4.107:22-172.24.4.1:59948.service - OpenSSH per-connection server daemon (172.24.4.1:59948). Mar 21 13:36:26.957623 sshd[5543]: Accepted publickey for core from 172.24.4.1 port 59948 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:36:26.960419 sshd-session[5543]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:36:26.973534 systemd-logind[1460]: New session 33 of user core. Mar 21 13:36:26.979740 systemd[1]: Started session-33.scope - Session 33 of User core. Mar 21 13:36:27.681027 sshd[5545]: Connection closed by 172.24.4.1 port 59948 Mar 21 13:36:27.682120 sshd-session[5543]: pam_unix(sshd:session): session closed for user core Mar 21 13:36:27.689363 systemd[1]: sshd@30-172.24.4.107:22-172.24.4.1:59948.service: Deactivated successfully. Mar 21 13:36:27.695923 systemd[1]: session-33.scope: Deactivated successfully. Mar 21 13:36:27.699140 systemd-logind[1460]: Session 33 logged out. Waiting for processes to exit. Mar 21 13:36:27.704141 systemd-logind[1460]: Removed session 33. Mar 21 13:36:32.702367 systemd[1]: Started sshd@31-172.24.4.107:22-172.24.4.1:59960.service - OpenSSH per-connection server daemon (172.24.4.1:59960). 
Mar 21 13:36:34.022021 sshd[5557]: Accepted publickey for core from 172.24.4.1 port 59960 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:36:34.025049 sshd-session[5557]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:36:34.037553 systemd-logind[1460]: New session 34 of user core.
Mar 21 13:36:34.043799 systemd[1]: Started session-34.scope - Session 34 of User core.
Mar 21 13:36:34.819520 sshd[5559]: Connection closed by 172.24.4.1 port 59960
Mar 21 13:36:34.820880 sshd-session[5557]: pam_unix(sshd:session): session closed for user core
Mar 21 13:36:34.828718 systemd[1]: sshd@31-172.24.4.107:22-172.24.4.1:59960.service: Deactivated successfully.
Mar 21 13:36:34.833073 systemd[1]: session-34.scope: Deactivated successfully.
Mar 21 13:36:34.835046 systemd-logind[1460]: Session 34 logged out. Waiting for processes to exit.
Mar 21 13:36:34.838644 systemd-logind[1460]: Removed session 34.
Mar 21 13:36:35.309328 containerd[1480]: time="2025-03-21T13:36:35.309158249Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"541952ca95cf22c69b2031a8df975279872ae605aa9de2850289d353c7850e69\" pid:5582 exited_at:{seconds:1742564195 nanos:306401786}"
Mar 21 13:36:36.063530 containerd[1480]: time="2025-03-21T13:36:36.063463462Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"3cc1540522d8ea8f6a4938654a19f93312c876b7816b59c4d39522d4a54c192f\" pid:5603 exited_at:{seconds:1742564196 nanos:63136490}"
Mar 21 13:36:39.844735 systemd[1]: Started sshd@32-172.24.4.107:22-172.24.4.1:44442.service - OpenSSH per-connection server daemon (172.24.4.1:44442).
Mar 21 13:36:41.135966 sshd[5616]: Accepted publickey for core from 172.24.4.1 port 44442 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:36:41.138895 sshd-session[5616]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:36:41.151945 systemd-logind[1460]: New session 35 of user core.
Mar 21 13:36:41.159752 systemd[1]: Started session-35.scope - Session 35 of User core.
Mar 21 13:36:41.900481 sshd[5620]: Connection closed by 172.24.4.1 port 44442
Mar 21 13:36:41.901591 sshd-session[5616]: pam_unix(sshd:session): session closed for user core
Mar 21 13:36:41.909087 systemd[1]: sshd@32-172.24.4.107:22-172.24.4.1:44442.service: Deactivated successfully.
Mar 21 13:36:41.913755 systemd[1]: session-35.scope: Deactivated successfully.
Mar 21 13:36:41.916209 systemd-logind[1460]: Session 35 logged out. Waiting for processes to exit.
Mar 21 13:36:41.919016 systemd-logind[1460]: Removed session 35.
Mar 21 13:36:46.926694 systemd[1]: Started sshd@33-172.24.4.107:22-172.24.4.1:34152.service - OpenSSH per-connection server daemon (172.24.4.1:34152).
Mar 21 13:36:48.274028 sshd[5632]: Accepted publickey for core from 172.24.4.1 port 34152 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:36:48.276866 sshd-session[5632]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:36:48.287587 systemd-logind[1460]: New session 36 of user core.
Mar 21 13:36:48.295942 systemd[1]: Started session-36.scope - Session 36 of User core.
Mar 21 13:36:49.030945 sshd[5634]: Connection closed by 172.24.4.1 port 34152
Mar 21 13:36:49.032046 sshd-session[5632]: pam_unix(sshd:session): session closed for user core
Mar 21 13:36:49.040537 systemd[1]: sshd@33-172.24.4.107:22-172.24.4.1:34152.service: Deactivated successfully.
Mar 21 13:36:49.044839 systemd[1]: session-36.scope: Deactivated successfully.
Mar 21 13:36:49.047717 systemd-logind[1460]: Session 36 logged out. Waiting for processes to exit.
Mar 21 13:36:49.050641 systemd-logind[1460]: Removed session 36.
Mar 21 13:36:50.540053 containerd[1480]: time="2025-03-21T13:36:50.539872056Z" level=warning msg="container event discarded" container=d2f5e52749ea26c8d0dcba5b3b0781e4ff1e1ae94e053d89948befa533324331 type=CONTAINER_CREATED_EVENT
Mar 21 13:36:50.551377 containerd[1480]: time="2025-03-21T13:36:50.551316116Z" level=warning msg="container event discarded" container=d2f5e52749ea26c8d0dcba5b3b0781e4ff1e1ae94e053d89948befa533324331 type=CONTAINER_STARTED_EVENT
Mar 21 13:36:50.562739 containerd[1480]: time="2025-03-21T13:36:50.562633230Z" level=warning msg="container event discarded" container=2d199e12476ad7c9cf9d1af444402f5a8ebe85503b8f121ffbc50775024ea94c type=CONTAINER_CREATED_EVENT
Mar 21 13:36:50.562739 containerd[1480]: time="2025-03-21T13:36:50.562694565Z" level=warning msg="container event discarded" container=2d199e12476ad7c9cf9d1af444402f5a8ebe85503b8f121ffbc50775024ea94c type=CONTAINER_STARTED_EVENT
Mar 21 13:36:50.562739 containerd[1480]: time="2025-03-21T13:36:50.562717347Z" level=warning msg="container event discarded" container=3dd6cbcf37b2177d9e8d4de8c65348677b445ae455329a5861428f4a5c6cf5b0 type=CONTAINER_CREATED_EVENT
Mar 21 13:36:50.562739 containerd[1480]: time="2025-03-21T13:36:50.562737014Z" level=warning msg="container event discarded" container=3dd6cbcf37b2177d9e8d4de8c65348677b445ae455329a5861428f4a5c6cf5b0 type=CONTAINER_STARTED_EVENT
Mar 21 13:36:50.599177 containerd[1480]: time="2025-03-21T13:36:50.599060038Z" level=warning msg="container event discarded" container=9d715641f0aec4c114cc19ea9f86721519c854a10dfa8609c0ad200986ffcdb1 type=CONTAINER_CREATED_EVENT
Mar 21 13:36:50.599177 containerd[1480]: time="2025-03-21T13:36:50.599166608Z" level=warning msg="container event discarded" container=3394bc2c134f32fa30e03dd4de582abd748cc62eceea48afca90ce9b659343df type=CONTAINER_CREATED_EVENT
Mar 21 13:36:50.627594 containerd[1480]: time="2025-03-21T13:36:50.627486310Z" level=warning msg="container event discarded" container=93ef758a66af93eba7f5a5b0d6e726b6a8f77f28e767809b7e8bd7e6ec0f6ca0 type=CONTAINER_CREATED_EVENT
Mar 21 13:36:50.714165 containerd[1480]: time="2025-03-21T13:36:50.714070724Z" level=warning msg="container event discarded" container=9d715641f0aec4c114cc19ea9f86721519c854a10dfa8609c0ad200986ffcdb1 type=CONTAINER_STARTED_EVENT
Mar 21 13:36:50.748922 containerd[1480]: time="2025-03-21T13:36:50.748768963Z" level=warning msg="container event discarded" container=3394bc2c134f32fa30e03dd4de582abd748cc62eceea48afca90ce9b659343df type=CONTAINER_STARTED_EVENT
Mar 21 13:36:50.749090 containerd[1480]: time="2025-03-21T13:36:50.748915167Z" level=warning msg="container event discarded" container=93ef758a66af93eba7f5a5b0d6e726b6a8f77f28e767809b7e8bd7e6ec0f6ca0 type=CONTAINER_STARTED_EVENT
Mar 21 13:36:54.055022 systemd[1]: Started sshd@34-172.24.4.107:22-172.24.4.1:38628.service - OpenSSH per-connection server daemon (172.24.4.1:38628).
Mar 21 13:36:55.358649 sshd[5645]: Accepted publickey for core from 172.24.4.1 port 38628 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:36:55.361608 sshd-session[5645]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:36:55.372949 systemd-logind[1460]: New session 37 of user core.
Mar 21 13:36:55.384921 systemd[1]: Started session-37.scope - Session 37 of User core.
Mar 21 13:36:56.196738 sshd[5647]: Connection closed by 172.24.4.1 port 38628
Mar 21 13:36:56.197616 sshd-session[5645]: pam_unix(sshd:session): session closed for user core
Mar 21 13:36:56.207032 systemd[1]: sshd@34-172.24.4.107:22-172.24.4.1:38628.service: Deactivated successfully.
Mar 21 13:36:56.211666 systemd[1]: session-37.scope: Deactivated successfully.
Mar 21 13:36:56.215766 systemd-logind[1460]: Session 37 logged out. Waiting for processes to exit.
Mar 21 13:36:56.220898 systemd-logind[1460]: Removed session 37.
Mar 21 13:37:01.219398 systemd[1]: Started sshd@35-172.24.4.107:22-172.24.4.1:38634.service - OpenSSH per-connection server daemon (172.24.4.1:38634).
Mar 21 13:37:02.526190 sshd[5661]: Accepted publickey for core from 172.24.4.1 port 38634 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:37:02.529185 sshd-session[5661]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:37:02.541365 systemd-logind[1460]: New session 38 of user core.
Mar 21 13:37:02.548735 systemd[1]: Started session-38.scope - Session 38 of User core.
Mar 21 13:37:03.274290 sshd[5663]: Connection closed by 172.24.4.1 port 38634
Mar 21 13:37:03.273796 sshd-session[5661]: pam_unix(sshd:session): session closed for user core
Mar 21 13:37:03.277401 systemd[1]: sshd@35-172.24.4.107:22-172.24.4.1:38634.service: Deactivated successfully.
Mar 21 13:37:03.279346 systemd[1]: session-38.scope: Deactivated successfully.
Mar 21 13:37:03.281803 systemd-logind[1460]: Session 38 logged out. Waiting for processes to exit.
Mar 21 13:37:03.283278 systemd-logind[1460]: Removed session 38.
Mar 21 13:37:05.314329 containerd[1480]: time="2025-03-21T13:37:05.314279514Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"2fdc7a8c96e73b4a5e8152742d13e173962f79a0bb50408629b883d46065477c\" pid:5686 exited_at:{seconds:1742564225 nanos:314062878}"
Mar 21 13:37:06.016339 containerd[1480]: time="2025-03-21T13:37:06.016277536Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"e3736318b8144a1e94b7b32ca78e89ec5e8254335e67d184c2251e3a2e0b571e\" pid:5709 exited_at:{seconds:1742564226 nanos:15733756}"
Mar 21 13:37:07.855117 containerd[1480]: time="2025-03-21T13:37:07.855076239Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"c5cf8b58827edad54bb18f3624d81ae18d31cc5927f3969d864dcbe48128f064\" pid:5734 exited_at:{seconds:1742564227 nanos:854899047}"
Mar 21 13:37:08.295786 systemd[1]: Started sshd@36-172.24.4.107:22-172.24.4.1:35542.service - OpenSSH per-connection server daemon (172.24.4.1:35542).
Mar 21 13:37:09.429118 sshd[5744]: Accepted publickey for core from 172.24.4.1 port 35542 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:37:09.431867 sshd-session[5744]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:37:09.444559 systemd-logind[1460]: New session 39 of user core.
Mar 21 13:37:09.452751 systemd[1]: Started session-39.scope - Session 39 of User core.
Mar 21 13:37:10.306634 sshd[5746]: Connection closed by 172.24.4.1 port 35542
Mar 21 13:37:10.307124 sshd-session[5744]: pam_unix(sshd:session): session closed for user core
Mar 21 13:37:10.314394 systemd-logind[1460]: Session 39 logged out. Waiting for processes to exit.
Mar 21 13:37:10.316754 systemd[1]: sshd@36-172.24.4.107:22-172.24.4.1:35542.service: Deactivated successfully.
Mar 21 13:37:10.323408 systemd[1]: session-39.scope: Deactivated successfully.
Mar 21 13:37:10.330078 systemd-logind[1460]: Removed session 39.
Mar 21 13:37:10.642462 containerd[1480]: time="2025-03-21T13:37:10.642241750Z" level=warning msg="container event discarded" container=231c624399113ae18968ebe36de4fe4a590af5fbea68bf60d6eb8a1cb2c900aa type=CONTAINER_CREATED_EVENT
Mar 21 13:37:10.642462 containerd[1480]: time="2025-03-21T13:37:10.642390018Z" level=warning msg="container event discarded" container=231c624399113ae18968ebe36de4fe4a590af5fbea68bf60d6eb8a1cb2c900aa type=CONTAINER_STARTED_EVENT
Mar 21 13:37:10.677788 containerd[1480]: time="2025-03-21T13:37:10.677701416Z" level=warning msg="container event discarded" container=29fd3fb4e4214769d3e9b47236da68041c2cb95f2f4109019627b9f1ee06a867 type=CONTAINER_CREATED_EVENT
Mar 21 13:37:10.777197 containerd[1480]: time="2025-03-21T13:37:10.777079222Z" level=warning msg="container event discarded" container=29fd3fb4e4214769d3e9b47236da68041c2cb95f2f4109019627b9f1ee06a867 type=CONTAINER_STARTED_EVENT
Mar 21 13:37:10.810817 containerd[1480]: time="2025-03-21T13:37:10.810672026Z" level=warning msg="container event discarded" container=0b4101624f5acda3e039e0773aeb97b8ba80c9ef707bdbf2389575ad15897636 type=CONTAINER_CREATED_EVENT
Mar 21 13:37:10.810817 containerd[1480]: time="2025-03-21T13:37:10.810763548Z" level=warning msg="container event discarded" container=0b4101624f5acda3e039e0773aeb97b8ba80c9ef707bdbf2389575ad15897636 type=CONTAINER_STARTED_EVENT
Mar 21 13:37:15.201591 containerd[1480]: time="2025-03-21T13:37:15.201421487Z" level=warning msg="container event discarded" container=d542ad8f2648760316e4977a02dfdd8fee452bd639e0c4a35733a24d50a15357 type=CONTAINER_CREATED_EVENT
Mar 21 13:37:15.265500 containerd[1480]: time="2025-03-21T13:37:15.265268478Z" level=warning msg="container event discarded" container=d542ad8f2648760316e4977a02dfdd8fee452bd639e0c4a35733a24d50a15357 type=CONTAINER_STARTED_EVENT
Mar 21 13:37:15.327875 systemd[1]: Started sshd@37-172.24.4.107:22-172.24.4.1:52706.service - OpenSSH per-connection server daemon (172.24.4.1:52706).
Mar 21 13:37:16.535673 sshd[5760]: Accepted publickey for core from 172.24.4.1 port 52706 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:37:16.539019 sshd-session[5760]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:37:16.552584 systemd-logind[1460]: New session 40 of user core.
Mar 21 13:37:16.558745 systemd[1]: Started session-40.scope - Session 40 of User core.
Mar 21 13:37:17.282894 sshd[5762]: Connection closed by 172.24.4.1 port 52706
Mar 21 13:37:17.284046 sshd-session[5760]: pam_unix(sshd:session): session closed for user core
Mar 21 13:37:17.292270 systemd-logind[1460]: Session 40 logged out. Waiting for processes to exit.
Mar 21 13:37:17.293986 systemd[1]: sshd@37-172.24.4.107:22-172.24.4.1:52706.service: Deactivated successfully.
Mar 21 13:37:17.299699 systemd[1]: session-40.scope: Deactivated successfully.
Mar 21 13:37:17.302969 systemd-logind[1460]: Removed session 40.
Mar 21 13:37:20.747037 containerd[1480]: time="2025-03-21T13:37:20.746831863Z" level=warning msg="container event discarded" container=d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b type=CONTAINER_CREATED_EVENT
Mar 21 13:37:20.747037 containerd[1480]: time="2025-03-21T13:37:20.746969972Z" level=warning msg="container event discarded" container=d15208e92e8dd05939f378789987a9d52997ab3b1a2d2ac4f78fbb454c18733b type=CONTAINER_STARTED_EVENT
Mar 21 13:37:20.825339 containerd[1480]: time="2025-03-21T13:37:20.825229614Z" level=warning msg="container event discarded" container=5f047371f53fe4635c0bc00385aa2d0412afe5a1fce0e8302e7cfefefe0bfe4a type=CONTAINER_CREATED_EVENT
Mar 21 13:37:20.825339 containerd[1480]: time="2025-03-21T13:37:20.825316708Z" level=warning msg="container event discarded" container=5f047371f53fe4635c0bc00385aa2d0412afe5a1fce0e8302e7cfefefe0bfe4a type=CONTAINER_STARTED_EVENT
Mar 21 13:37:22.305071 systemd[1]: Started sshd@38-172.24.4.107:22-172.24.4.1:52712.service - OpenSSH per-connection server daemon (172.24.4.1:52712).
Mar 21 13:37:22.944625 containerd[1480]: time="2025-03-21T13:37:22.944493168Z" level=warning msg="container event discarded" container=96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7 type=CONTAINER_CREATED_EVENT
Mar 21 13:37:23.044070 containerd[1480]: time="2025-03-21T13:37:23.043942341Z" level=warning msg="container event discarded" container=96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7 type=CONTAINER_STARTED_EVENT
Mar 21 13:37:23.499619 containerd[1480]: time="2025-03-21T13:37:23.499509113Z" level=warning msg="container event discarded" container=96948ca349177599143c9faf58ad87bdfa985a5556e81161c38b2eda3108b3e7 type=CONTAINER_STOPPED_EVENT
Mar 21 13:37:23.688933 sshd[5774]: Accepted publickey for core from 172.24.4.1 port 52712 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:37:23.691750 sshd-session[5774]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:37:23.703539 systemd-logind[1460]: New session 41 of user core.
Mar 21 13:37:23.711734 systemd[1]: Started session-41.scope - Session 41 of User core.
Mar 21 13:37:24.311523 sshd[5776]: Connection closed by 172.24.4.1 port 52712
Mar 21 13:37:24.312678 sshd-session[5774]: pam_unix(sshd:session): session closed for user core
Mar 21 13:37:24.319690 systemd[1]: sshd@38-172.24.4.107:22-172.24.4.1:52712.service: Deactivated successfully.
Mar 21 13:37:24.323894 systemd[1]: session-41.scope: Deactivated successfully.
Mar 21 13:37:24.326815 systemd-logind[1460]: Session 41 logged out. Waiting for processes to exit.
Mar 21 13:37:24.330064 systemd-logind[1460]: Removed session 41.
Mar 21 13:37:26.345850 containerd[1480]: time="2025-03-21T13:37:26.345736264Z" level=warning msg="container event discarded" container=dd18038fd36a59a4f641184f874543283a56b4ecb6d981fd87083f23e63c1ada type=CONTAINER_CREATED_EVENT
Mar 21 13:37:26.439685 containerd[1480]: time="2025-03-21T13:37:26.439594451Z" level=warning msg="container event discarded" container=dd18038fd36a59a4f641184f874543283a56b4ecb6d981fd87083f23e63c1ada type=CONTAINER_STARTED_EVENT
Mar 21 13:37:29.338843 systemd[1]: Started sshd@39-172.24.4.107:22-172.24.4.1:42852.service - OpenSSH per-connection server daemon (172.24.4.1:42852).
Mar 21 13:37:30.540964 sshd[5804]: Accepted publickey for core from 172.24.4.1 port 42852 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:37:30.544039 sshd-session[5804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:37:30.556534 systemd-logind[1460]: New session 42 of user core.
Mar 21 13:37:30.564772 systemd[1]: Started session-42.scope - Session 42 of User core.
Mar 21 13:37:31.287959 sshd[5806]: Connection closed by 172.24.4.1 port 42852
Mar 21 13:37:31.289306 sshd-session[5804]: pam_unix(sshd:session): session closed for user core
Mar 21 13:37:31.294213 systemd-logind[1460]: Session 42 logged out. Waiting for processes to exit.
Mar 21 13:37:31.295336 systemd[1]: sshd@39-172.24.4.107:22-172.24.4.1:42852.service: Deactivated successfully.
Mar 21 13:37:31.300321 systemd[1]: session-42.scope: Deactivated successfully.
Mar 21 13:37:31.304075 systemd-logind[1460]: Removed session 42.
Mar 21 13:37:32.715044 containerd[1480]: time="2025-03-21T13:37:32.714934825Z" level=warning msg="container event discarded" container=050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370 type=CONTAINER_CREATED_EVENT
Mar 21 13:37:32.797496 containerd[1480]: time="2025-03-21T13:37:32.797317959Z" level=warning msg="container event discarded" container=050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370 type=CONTAINER_STARTED_EVENT
Mar 21 13:37:35.173712 containerd[1480]: time="2025-03-21T13:37:35.173621764Z" level=warning msg="container event discarded" container=050cf1559b9b2fdf0bd1e2b110f3685bd198beb4068fa04c01d878912db30370 type=CONTAINER_STOPPED_EVENT
Mar 21 13:37:35.307210 containerd[1480]: time="2025-03-21T13:37:35.306947772Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"97e88b24e15353ceaa3fad46fef7c0e3f1292cfba0db9c2ad5aa324d9617eac1\" pid:5829 exited_at:{seconds:1742564255 nanos:306560384}"
Mar 21 13:37:36.067077 containerd[1480]: time="2025-03-21T13:37:36.066987967Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"698b6b8ac67745567e1764890651eabb3da4d1fb70ec37d7ec23b4934a03c552\" pid:5850 exited_at:{seconds:1742564256 nanos:66477920}"
Mar 21 13:37:36.307711 systemd[1]: Started sshd@40-172.24.4.107:22-172.24.4.1:36770.service - OpenSSH per-connection server daemon (172.24.4.1:36770).
Mar 21 13:37:37.449628 sshd[5863]: Accepted publickey for core from 172.24.4.1 port 36770 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:37:37.452584 sshd-session[5863]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:37:37.463589 systemd-logind[1460]: New session 43 of user core.
Mar 21 13:37:37.469662 systemd[1]: Started session-43.scope - Session 43 of User core.
Mar 21 13:37:38.297472 sshd[5865]: Connection closed by 172.24.4.1 port 36770
Mar 21 13:37:38.298495 sshd-session[5863]: pam_unix(sshd:session): session closed for user core
Mar 21 13:37:38.304866 systemd-logind[1460]: Session 43 logged out. Waiting for processes to exit.
Mar 21 13:37:38.306195 systemd[1]: sshd@40-172.24.4.107:22-172.24.4.1:36770.service: Deactivated successfully.
Mar 21 13:37:38.311018 systemd[1]: session-43.scope: Deactivated successfully.
Mar 21 13:37:38.313680 systemd-logind[1460]: Removed session 43.
Mar 21 13:37:43.323901 systemd[1]: Started sshd@41-172.24.4.107:22-172.24.4.1:36774.service - OpenSSH per-connection server daemon (172.24.4.1:36774).
Mar 21 13:37:44.417831 containerd[1480]: time="2025-03-21T13:37:44.417671890Z" level=warning msg="container event discarded" container=1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a type=CONTAINER_CREATED_EVENT
Mar 21 13:37:44.521607 containerd[1480]: time="2025-03-21T13:37:44.521530830Z" level=warning msg="container event discarded" container=1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a type=CONTAINER_STARTED_EVENT
Mar 21 13:37:44.626958 sshd[5879]: Accepted publickey for core from 172.24.4.1 port 36774 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:37:44.629968 sshd-session[5879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:37:44.641771 systemd-logind[1460]: New session 44 of user core.
Mar 21 13:37:44.651740 systemd[1]: Started session-44.scope - Session 44 of User core.
Mar 21 13:37:45.274526 sshd[5881]: Connection closed by 172.24.4.1 port 36774
Mar 21 13:37:45.275523 sshd-session[5879]: pam_unix(sshd:session): session closed for user core
Mar 21 13:37:45.279849 systemd[1]: sshd@41-172.24.4.107:22-172.24.4.1:36774.service: Deactivated successfully.
Mar 21 13:37:45.281810 systemd[1]: session-44.scope: Deactivated successfully.
Mar 21 13:37:45.283280 systemd-logind[1460]: Session 44 logged out. Waiting for processes to exit.
Mar 21 13:37:45.285700 systemd-logind[1460]: Removed session 44.
Mar 21 13:37:47.648428 containerd[1480]: time="2025-03-21T13:37:47.648298436Z" level=warning msg="container event discarded" container=6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a type=CONTAINER_CREATED_EVENT
Mar 21 13:37:47.648428 containerd[1480]: time="2025-03-21T13:37:47.648378947Z" level=warning msg="container event discarded" container=6ad60e204b8d600e780142d1ea0386f2753126c41a4947c70344d9ae444a506a type=CONTAINER_STARTED_EVENT
Mar 21 13:37:49.107836 containerd[1480]: time="2025-03-21T13:37:49.107731605Z" level=warning msg="container event discarded" container=d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102 type=CONTAINER_CREATED_EVENT
Mar 21 13:37:49.107836 containerd[1480]: time="2025-03-21T13:37:49.107811084Z" level=warning msg="container event discarded" container=d1c62e2204c440250d7c1e3f1951df2511f7fa19576746956dd13314281c8102 type=CONTAINER_STARTED_EVENT
Mar 21 13:37:49.223900 containerd[1480]: time="2025-03-21T13:37:49.223760512Z" level=warning msg="container event discarded" container=49babdf5ea7ebdac7d33dc87a0ee56439d30bcb04975223f02da966f9559adc8 type=CONTAINER_CREATED_EVENT
Mar 21 13:37:49.268510 containerd[1480]: time="2025-03-21T13:37:49.268318723Z" level=warning msg="container event discarded" container=96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425 type=CONTAINER_CREATED_EVENT
Mar 21 13:37:49.268510 containerd[1480]: time="2025-03-21T13:37:49.268398633Z" level=warning msg="container event discarded" container=96c87f782d48f0be73a03b008e0cb1c0036f92e937d842cb69d7f1f90ff4a425 type=CONTAINER_STARTED_EVENT
Mar 21 13:37:49.381042 containerd[1480]: time="2025-03-21T13:37:49.380802045Z" level=warning msg="container event discarded" container=49babdf5ea7ebdac7d33dc87a0ee56439d30bcb04975223f02da966f9559adc8 type=CONTAINER_STARTED_EVENT
Mar 21 13:37:49.508204 containerd[1480]: time="2025-03-21T13:37:49.507980058Z" level=warning msg="container event discarded" container=029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235 type=CONTAINER_CREATED_EVENT
Mar 21 13:37:49.508204 containerd[1480]: time="2025-03-21T13:37:49.508045971Z" level=warning msg="container event discarded" container=029fe5b5cb9adad833d0ec9a08e69611783fc209411d552cb249d661605a9235 type=CONTAINER_STARTED_EVENT
Mar 21 13:37:49.563255 containerd[1480]: time="2025-03-21T13:37:49.563152632Z" level=warning msg="container event discarded" container=7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b type=CONTAINER_CREATED_EVENT
Mar 21 13:37:49.563255 containerd[1480]: time="2025-03-21T13:37:49.563225168Z" level=warning msg="container event discarded" container=7cf26cac9bbad5b591b59a09808132ba700987ef4ab18a74a12871d02c187f6b type=CONTAINER_STARTED_EVENT
Mar 21 13:37:50.296753 systemd[1]: Started sshd@42-172.24.4.107:22-172.24.4.1:42942.service - OpenSSH per-connection server daemon (172.24.4.1:42942).
Mar 21 13:37:50.407476 containerd[1480]: time="2025-03-21T13:37:50.407308071Z" level=warning msg="container event discarded" container=9612c2493a0ce2c246fac295f19a8a83791fc1a27cccf09321eaf238887c3eef type=CONTAINER_CREATED_EVENT
Mar 21 13:37:50.592941 containerd[1480]: time="2025-03-21T13:37:50.592695561Z" level=warning msg="container event discarded" container=9612c2493a0ce2c246fac295f19a8a83791fc1a27cccf09321eaf238887c3eef type=CONTAINER_STARTED_EVENT
Mar 21 13:37:51.512049 sshd[5894]: Accepted publickey for core from 172.24.4.1 port 42942 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:37:51.518565 sshd-session[5894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:37:51.533578 systemd-logind[1460]: New session 45 of user core.
Mar 21 13:37:51.540881 systemd[1]: Started session-45.scope - Session 45 of User core.
Mar 21 13:37:51.646668 containerd[1480]: time="2025-03-21T13:37:51.646545474Z" level=warning msg="container event discarded" container=c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1 type=CONTAINER_CREATED_EVENT
Mar 21 13:37:51.646668 containerd[1480]: time="2025-03-21T13:37:51.646619352Z" level=warning msg="container event discarded" container=c6fb6e7c9e04820fe83cb45ff65e7cb57d39a7f373151890c20fe789b78434d1 type=CONTAINER_STARTED_EVENT
Mar 21 13:37:51.892516 containerd[1480]: time="2025-03-21T13:37:51.892270752Z" level=warning msg="container event discarded" container=742396fc17cc631084480aa87fcf5f3de4375d23ed8eb6ebf989fc112fef794e type=CONTAINER_CREATED_EVENT
Mar 21 13:37:51.971721 containerd[1480]: time="2025-03-21T13:37:51.971623273Z" level=warning msg="container event discarded" container=742396fc17cc631084480aa87fcf5f3de4375d23ed8eb6ebf989fc112fef794e type=CONTAINER_STARTED_EVENT
Mar 21 13:37:52.273608 sshd[5896]: Connection closed by 172.24.4.1 port 42942
Mar 21 13:37:52.274665 sshd-session[5894]: pam_unix(sshd:session): session closed for user core
Mar 21 13:37:52.281210 systemd[1]: sshd@42-172.24.4.107:22-172.24.4.1:42942.service: Deactivated successfully.
Mar 21 13:37:52.285242 systemd[1]: session-45.scope: Deactivated successfully.
Mar 21 13:37:52.289625 systemd-logind[1460]: Session 45 logged out. Waiting for processes to exit.
Mar 21 13:37:52.292206 systemd-logind[1460]: Removed session 45.
Mar 21 13:37:57.146105 containerd[1480]: time="2025-03-21T13:37:57.145988520Z" level=warning msg="container event discarded" container=27ab9c9fc5bfadebb82fcc6c96a1e60f17cea6b086a51661b42341011f836f29 type=CONTAINER_CREATED_EVENT
Mar 21 13:37:57.241509 containerd[1480]: time="2025-03-21T13:37:57.241311281Z" level=warning msg="container event discarded" container=27ab9c9fc5bfadebb82fcc6c96a1e60f17cea6b086a51661b42341011f836f29 type=CONTAINER_STARTED_EVENT
Mar 21 13:37:57.299164 systemd[1]: Started sshd@43-172.24.4.107:22-172.24.4.1:39358.service - OpenSSH per-connection server daemon (172.24.4.1:39358).
Mar 21 13:37:58.507760 sshd[5910]: Accepted publickey for core from 172.24.4.1 port 39358 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:37:58.509214 sshd-session[5910]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:37:58.519695 systemd-logind[1460]: New session 46 of user core.
Mar 21 13:37:58.526715 systemd[1]: Started session-46.scope - Session 46 of User core.
Mar 21 13:37:59.197409 sshd[5912]: Connection closed by 172.24.4.1 port 39358
Mar 21 13:37:59.198042 sshd-session[5910]: pam_unix(sshd:session): session closed for user core
Mar 21 13:37:59.202046 systemd-logind[1460]: Session 46 logged out. Waiting for processes to exit.
Mar 21 13:37:59.202741 systemd[1]: sshd@43-172.24.4.107:22-172.24.4.1:39358.service: Deactivated successfully.
Mar 21 13:37:59.205103 systemd[1]: session-46.scope: Deactivated successfully.
Mar 21 13:37:59.206944 systemd-logind[1460]: Removed session 46.
Mar 21 13:38:02.360944 containerd[1480]: time="2025-03-21T13:38:02.360760432Z" level=warning msg="container event discarded" container=2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b type=CONTAINER_CREATED_EVENT
Mar 21 13:38:02.440658 containerd[1480]: time="2025-03-21T13:38:02.440539896Z" level=warning msg="container event discarded" container=2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b type=CONTAINER_STARTED_EVENT
Mar 21 13:38:02.825996 containerd[1480]: time="2025-03-21T13:38:02.825761090Z" level=warning msg="container event discarded" container=2119f90383043a4828d76138afb22a1f821857ce01a1ef1d0b89efaf899edbfd type=CONTAINER_CREATED_EVENT
Mar 21 13:38:03.024581 containerd[1480]: time="2025-03-21T13:38:03.024387456Z" level=warning msg="container event discarded" container=2119f90383043a4828d76138afb22a1f821857ce01a1ef1d0b89efaf899edbfd type=CONTAINER_STARTED_EVENT
Mar 21 13:38:04.221168 systemd[1]: Started sshd@44-172.24.4.107:22-172.24.4.1:34160.service - OpenSSH per-connection server daemon (172.24.4.1:34160).
Mar 21 13:38:05.310326 containerd[1480]: time="2025-03-21T13:38:05.310268016Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"d40047989c1fe3e2c0eef0ca9b42a18a628e6321974b6d310eaa837829bbd2f5\" pid:5937 exited_at:{seconds:1742564285 nanos:309959426}"
Mar 21 13:38:05.410433 containerd[1480]: time="2025-03-21T13:38:05.410269667Z" level=warning msg="container event discarded" container=3c6d4dd547dec9782e1e905dc501dd2ab9992e0c1f0cf6191837ef09a86b3a3a type=CONTAINER_CREATED_EVENT
Mar 21 13:38:05.500696 containerd[1480]: time="2025-03-21T13:38:05.500567903Z" level=warning msg="container event discarded" container=3c6d4dd547dec9782e1e905dc501dd2ab9992e0c1f0cf6191837ef09a86b3a3a type=CONTAINER_STARTED_EVENT
Mar 21 13:38:05.528342 sshd[5923]: Accepted publickey for core from 172.24.4.1 port 34160 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:38:05.531316 sshd-session[5923]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:38:05.543278 systemd-logind[1460]: New session 47 of user core.
Mar 21 13:38:05.554761 systemd[1]: Started session-47.scope - Session 47 of User core.
Mar 21 13:38:06.087926 containerd[1480]: time="2025-03-21T13:38:06.087749029Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"e11bc2abb65ea7eece1052062e33f08df89110bef92200a39cf8784d9658ee3b\" pid:5965 exited_at:{seconds:1742564286 nanos:87314262}"
Mar 21 13:38:06.275953 sshd[5946]: Connection closed by 172.24.4.1 port 34160
Mar 21 13:38:06.276536 sshd-session[5923]: pam_unix(sshd:session): session closed for user core
Mar 21 13:38:06.279372 systemd[1]: sshd@44-172.24.4.107:22-172.24.4.1:34160.service: Deactivated successfully.
Mar 21 13:38:06.281259 systemd[1]: session-47.scope: Deactivated successfully.
Mar 21 13:38:06.283842 systemd-logind[1460]: Session 47 logged out. Waiting for processes to exit. Mar 21 13:38:06.286350 systemd-logind[1460]: Removed session 47. Mar 21 13:38:07.847276 containerd[1480]: time="2025-03-21T13:38:07.847134452Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"15c48b73d7b2bc8789725241cd92e238591819c5f03bf84c4ddc60b82f0c9f98\" pid:5999 exited_at:{seconds:1742564287 nanos:845906907}" Mar 21 13:38:11.297025 systemd[1]: Started sshd@45-172.24.4.107:22-172.24.4.1:34176.service - OpenSSH per-connection server daemon (172.24.4.1:34176). Mar 21 13:38:12.507099 sshd[6011]: Accepted publickey for core from 172.24.4.1 port 34176 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:38:12.510247 sshd-session[6011]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:38:12.522116 systemd-logind[1460]: New session 48 of user core. Mar 21 13:38:12.526746 systemd[1]: Started session-48.scope - Session 48 of User core. Mar 21 13:38:13.395023 sshd[6015]: Connection closed by 172.24.4.1 port 34176 Mar 21 13:38:13.398201 sshd-session[6011]: pam_unix(sshd:session): session closed for user core Mar 21 13:38:13.404251 systemd[1]: sshd@45-172.24.4.107:22-172.24.4.1:34176.service: Deactivated successfully. Mar 21 13:38:13.409887 systemd[1]: session-48.scope: Deactivated successfully. Mar 21 13:38:13.413357 systemd-logind[1460]: Session 48 logged out. Waiting for processes to exit. Mar 21 13:38:13.416285 systemd-logind[1460]: Removed session 48. Mar 21 13:38:18.419002 systemd[1]: Started sshd@46-172.24.4.107:22-172.24.4.1:59938.service - OpenSSH per-connection server daemon (172.24.4.1:59938). 
Mar 21 13:38:19.631085 sshd[6027]: Accepted publickey for core from 172.24.4.1 port 59938 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:38:19.633953 sshd-session[6027]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:38:19.645964 systemd-logind[1460]: New session 49 of user core. Mar 21 13:38:19.652781 systemd[1]: Started session-49.scope - Session 49 of User core. Mar 21 13:38:20.232481 sshd[6029]: Connection closed by 172.24.4.1 port 59938 Mar 21 13:38:20.233524 sshd-session[6027]: pam_unix(sshd:session): session closed for user core Mar 21 13:38:20.251303 systemd[1]: sshd@46-172.24.4.107:22-172.24.4.1:59938.service: Deactivated successfully. Mar 21 13:38:20.258950 systemd[1]: session-49.scope: Deactivated successfully. Mar 21 13:38:20.261487 systemd-logind[1460]: Session 49 logged out. Waiting for processes to exit. Mar 21 13:38:20.266862 systemd[1]: Started sshd@47-172.24.4.107:22-172.24.4.1:59942.service - OpenSSH per-connection server daemon (172.24.4.1:59942). Mar 21 13:38:20.271492 systemd-logind[1460]: Removed session 49. Mar 21 13:38:21.775232 sshd[6039]: Accepted publickey for core from 172.24.4.1 port 59942 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:38:21.777968 sshd-session[6039]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:38:21.790531 systemd-logind[1460]: New session 50 of user core. Mar 21 13:38:21.799741 systemd[1]: Started session-50.scope - Session 50 of User core. Mar 21 13:38:22.908485 sshd[6042]: Connection closed by 172.24.4.1 port 59942 Mar 21 13:38:22.909495 sshd-session[6039]: pam_unix(sshd:session): session closed for user core Mar 21 13:38:22.923153 systemd[1]: sshd@47-172.24.4.107:22-172.24.4.1:59942.service: Deactivated successfully. Mar 21 13:38:22.928030 systemd[1]: session-50.scope: Deactivated successfully. Mar 21 13:38:22.932420 systemd-logind[1460]: Session 50 logged out. Waiting for processes to exit.
Mar 21 13:38:22.936053 systemd[1]: Started sshd@48-172.24.4.107:22-172.24.4.1:59950.service - OpenSSH per-connection server daemon (172.24.4.1:59950). Mar 21 13:38:22.940276 systemd-logind[1460]: Removed session 50. Mar 21 13:38:24.954807 sshd[6052]: Accepted publickey for core from 172.24.4.1 port 59950 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:38:24.957614 sshd-session[6052]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:38:24.970563 systemd-logind[1460]: New session 51 of user core. Mar 21 13:38:24.978754 systemd[1]: Started session-51.scope - Session 51 of User core. Mar 21 13:38:28.397214 sshd[6055]: Connection closed by 172.24.4.1 port 59950 Mar 21 13:38:28.404409 sshd-session[6052]: pam_unix(sshd:session): session closed for user core Mar 21 13:38:28.425215 systemd[1]: sshd@48-172.24.4.107:22-172.24.4.1:59950.service: Deactivated successfully. Mar 21 13:38:28.431689 systemd[1]: session-51.scope: Deactivated successfully. Mar 21 13:38:28.432659 systemd[1]: session-51.scope: Consumed 847ms CPU time, 66.8M memory peak. Mar 21 13:38:28.436203 systemd-logind[1460]: Session 51 logged out. Waiting for processes to exit. Mar 21 13:38:28.441773 systemd[1]: Started sshd@49-172.24.4.107:22-172.24.4.1:53992.service - OpenSSH per-connection server daemon (172.24.4.1:53992). Mar 21 13:38:28.443554 systemd-logind[1460]: Removed session 51. Mar 21 13:38:29.669334 sshd[6073]: Accepted publickey for core from 172.24.4.1 port 53992 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:38:29.672043 sshd-session[6073]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:38:29.684948 systemd-logind[1460]: New session 52 of user core. Mar 21 13:38:29.689770 systemd[1]: Started session-52.scope - Session 52 of User core.
Mar 21 13:38:30.679267 sshd[6076]: Connection closed by 172.24.4.1 port 53992 Mar 21 13:38:30.679666 sshd-session[6073]: pam_unix(sshd:session): session closed for user core Mar 21 13:38:30.692714 systemd[1]: sshd@49-172.24.4.107:22-172.24.4.1:53992.service: Deactivated successfully. Mar 21 13:38:30.696553 systemd[1]: session-52.scope: Deactivated successfully. Mar 21 13:38:30.699207 systemd-logind[1460]: Session 52 logged out. Waiting for processes to exit. Mar 21 13:38:30.704651 systemd[1]: Started sshd@50-172.24.4.107:22-172.24.4.1:53994.service - OpenSSH per-connection server daemon (172.24.4.1:53994). Mar 21 13:38:30.708969 systemd-logind[1460]: Removed session 52. Mar 21 13:38:31.946495 sshd[6085]: Accepted publickey for core from 172.24.4.1 port 53994 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:38:31.949492 sshd-session[6085]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:38:31.962400 systemd-logind[1460]: New session 53 of user core. Mar 21 13:38:31.967769 systemd[1]: Started session-53.scope - Session 53 of User core. Mar 21 13:38:32.697646 sshd[6088]: Connection closed by 172.24.4.1 port 53994 Mar 21 13:38:32.698868 sshd-session[6085]: pam_unix(sshd:session): session closed for user core Mar 21 13:38:32.705083 systemd[1]: sshd@50-172.24.4.107:22-172.24.4.1:53994.service: Deactivated successfully. Mar 21 13:38:32.707369 systemd[1]: session-53.scope: Deactivated successfully. Mar 21 13:38:32.708885 systemd-logind[1460]: Session 53 logged out. Waiting for processes to exit. Mar 21 13:38:32.710345 systemd-logind[1460]: Removed session 53. 
Mar 21 13:38:35.312175 containerd[1480]: time="2025-03-21T13:38:35.311734539Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"a90987e5a22120fe7deff43ce93b6106834a820612cf8ad236fc397ad62a49e7\" pid:6112 exited_at:{seconds:1742564315 nanos:311032841}" Mar 21 13:38:36.062758 containerd[1480]: time="2025-03-21T13:38:36.062723547Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"1cee93ae5a37c1f2eea11b3b09a7605a948ef19129c39aef73f966ccfd1c0483\" pid:6133 exited_at:{seconds:1742564316 nanos:62207849}" Mar 21 13:38:37.718375 systemd[1]: Started sshd@51-172.24.4.107:22-172.24.4.1:50034.service - OpenSSH per-connection server daemon (172.24.4.1:50034). Mar 21 13:38:39.115166 sshd[6146]: Accepted publickey for core from 172.24.4.1 port 50034 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:38:39.118140 sshd-session[6146]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:38:39.130559 systemd-logind[1460]: New session 54 of user core. Mar 21 13:38:39.135730 systemd[1]: Started session-54.scope - Session 54 of User core. Mar 21 13:38:39.947717 sshd[6148]: Connection closed by 172.24.4.1 port 50034 Mar 21 13:38:39.948644 sshd-session[6146]: pam_unix(sshd:session): session closed for user core Mar 21 13:38:39.954784 systemd[1]: sshd@51-172.24.4.107:22-172.24.4.1:50034.service: Deactivated successfully. Mar 21 13:38:39.961270 systemd[1]: session-54.scope: Deactivated successfully. Mar 21 13:38:39.964039 systemd-logind[1460]: Session 54 logged out. Waiting for processes to exit. Mar 21 13:38:39.967321 systemd-logind[1460]: Removed session 54. Mar 21 13:38:44.971859 systemd[1]: Started sshd@52-172.24.4.107:22-172.24.4.1:37562.service - OpenSSH per-connection server daemon (172.24.4.1:37562). 
Mar 21 13:38:46.495825 sshd[6162]: Accepted publickey for core from 172.24.4.1 port 37562 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:38:46.499194 sshd-session[6162]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:38:46.512038 systemd-logind[1460]: New session 55 of user core. Mar 21 13:38:46.519766 systemd[1]: Started session-55.scope - Session 55 of User core. Mar 21 13:38:47.312613 sshd[6164]: Connection closed by 172.24.4.1 port 37562 Mar 21 13:38:47.312318 sshd-session[6162]: pam_unix(sshd:session): session closed for user core Mar 21 13:38:47.319999 systemd[1]: sshd@52-172.24.4.107:22-172.24.4.1:37562.service: Deactivated successfully. Mar 21 13:38:47.326264 systemd[1]: session-55.scope: Deactivated successfully. Mar 21 13:38:47.329053 systemd-logind[1460]: Session 55 logged out. Waiting for processes to exit. Mar 21 13:38:47.331570 systemd-logind[1460]: Removed session 55. Mar 21 13:38:52.335238 systemd[1]: Started sshd@53-172.24.4.107:22-172.24.4.1:37574.service - OpenSSH per-connection server daemon (172.24.4.1:37574). Mar 21 13:38:53.500404 sshd[6176]: Accepted publickey for core from 172.24.4.1 port 37574 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:38:53.503056 sshd-session[6176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:38:53.515551 systemd-logind[1460]: New session 56 of user core. Mar 21 13:38:53.521764 systemd[1]: Started session-56.scope - Session 56 of User core. Mar 21 13:38:54.273540 sshd[6178]: Connection closed by 172.24.4.1 port 37574 Mar 21 13:38:54.274608 sshd-session[6176]: pam_unix(sshd:session): session closed for user core Mar 21 13:38:54.281386 systemd[1]: sshd@53-172.24.4.107:22-172.24.4.1:37574.service: Deactivated successfully. Mar 21 13:38:54.284913 systemd[1]: session-56.scope: Deactivated successfully. Mar 21 13:38:54.286697 systemd-logind[1460]: Session 56 logged out. Waiting for processes to exit.
Mar 21 13:38:54.289161 systemd-logind[1460]: Removed session 56. Mar 21 13:38:59.295364 systemd[1]: Started sshd@54-172.24.4.107:22-172.24.4.1:57248.service - OpenSSH per-connection server daemon (172.24.4.1:57248). Mar 21 13:39:00.686865 sshd[6191]: Accepted publickey for core from 172.24.4.1 port 57248 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:39:00.689595 sshd-session[6191]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:39:00.700579 systemd-logind[1460]: New session 57 of user core. Mar 21 13:39:00.705935 systemd[1]: Started session-57.scope - Session 57 of User core. Mar 21 13:39:01.393405 sshd[6193]: Connection closed by 172.24.4.1 port 57248 Mar 21 13:39:01.393941 sshd-session[6191]: pam_unix(sshd:session): session closed for user core Mar 21 13:39:01.397844 systemd[1]: sshd@54-172.24.4.107:22-172.24.4.1:57248.service: Deactivated successfully. Mar 21 13:39:01.401100 systemd[1]: session-57.scope: Deactivated successfully. Mar 21 13:39:01.402898 systemd-logind[1460]: Session 57 logged out. Waiting for processes to exit. Mar 21 13:39:01.404280 systemd-logind[1460]: Removed session 57.
Mar 21 13:39:05.305240 containerd[1480]: time="2025-03-21T13:39:05.305135765Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"77379e5d7420a93e8d33fd8c5ea17bb82a1b8da881cae07bb1bfe3731990e6bf\" pid:6226 exited_at:{seconds:1742564345 nanos:304240185}" Mar 21 13:39:06.066723 containerd[1480]: time="2025-03-21T13:39:06.066618302Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"01797a3bf210bd18a8fba4bcd142c5619abe9dc89aba0bbe32af3a5f0c2f5541\" pid:6247 exited_at:{seconds:1742564346 nanos:66083051}" Mar 21 13:39:06.420617 systemd[1]: Started sshd@55-172.24.4.107:22-172.24.4.1:38562.service - OpenSSH per-connection server daemon (172.24.4.1:38562). Mar 21 13:39:07.637268 sshd[6266]: Accepted publickey for core from 172.24.4.1 port 38562 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:39:07.640402 sshd-session[6266]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:39:07.653020 systemd-logind[1460]: New session 58 of user core. Mar 21 13:39:07.663825 systemd[1]: Started session-58.scope - Session 58 of User core. Mar 21 13:39:07.859168 containerd[1480]: time="2025-03-21T13:39:07.858863089Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"9365d0a98c036a803accc955634044537085933c61895cb8054f9d7e59029221\" pid:6281 exited_at:{seconds:1742564347 nanos:858362644}" Mar 21 13:39:08.474964 sshd[6268]: Connection closed by 172.24.4.1 port 38562 Mar 21 13:39:08.475891 sshd-session[6266]: pam_unix(sshd:session): session closed for user core Mar 21 13:39:08.484597 systemd[1]: sshd@55-172.24.4.107:22-172.24.4.1:38562.service: Deactivated successfully. Mar 21 13:39:08.490378 systemd[1]: session-58.scope: Deactivated successfully. 
Mar 21 13:39:08.493189 systemd-logind[1460]: Session 58 logged out. Waiting for processes to exit. Mar 21 13:39:08.496018 systemd-logind[1460]: Removed session 58. Mar 21 13:39:13.497385 systemd[1]: Started sshd@56-172.24.4.107:22-172.24.4.1:38570.service - OpenSSH per-connection server daemon (172.24.4.1:38570). Mar 21 13:39:14.958554 sshd[6303]: Accepted publickey for core from 172.24.4.1 port 38570 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:39:14.961333 sshd-session[6303]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:39:14.974297 systemd-logind[1460]: New session 59 of user core. Mar 21 13:39:14.979747 systemd[1]: Started session-59.scope - Session 59 of User core. Mar 21 13:39:15.776524 sshd[6305]: Connection closed by 172.24.4.1 port 38570 Mar 21 13:39:15.777665 sshd-session[6303]: pam_unix(sshd:session): session closed for user core Mar 21 13:39:15.782659 systemd[1]: sshd@56-172.24.4.107:22-172.24.4.1:38570.service: Deactivated successfully. Mar 21 13:39:15.785197 systemd[1]: session-59.scope: Deactivated successfully. Mar 21 13:39:15.786408 systemd-logind[1460]: Session 59 logged out. Waiting for processes to exit. Mar 21 13:39:15.787391 systemd-logind[1460]: Removed session 59. Mar 21 13:39:20.799421 systemd[1]: Started sshd@57-172.24.4.107:22-172.24.4.1:49828.service - OpenSSH per-connection server daemon (172.24.4.1:49828). Mar 21 13:39:22.191294 sshd[6317]: Accepted publickey for core from 172.24.4.1 port 49828 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:39:22.194612 sshd-session[6317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:39:22.207627 systemd-logind[1460]: New session 60 of user core. Mar 21 13:39:22.214898 systemd[1]: Started session-60.scope - Session 60 of User core. 
Mar 21 13:39:23.064368 sshd[6319]: Connection closed by 172.24.4.1 port 49828 Mar 21 13:39:23.065020 sshd-session[6317]: pam_unix(sshd:session): session closed for user core Mar 21 13:39:23.070637 systemd[1]: sshd@57-172.24.4.107:22-172.24.4.1:49828.service: Deactivated successfully. Mar 21 13:39:23.076281 systemd[1]: session-60.scope: Deactivated successfully. Mar 21 13:39:23.077603 systemd-logind[1460]: Session 60 logged out. Waiting for processes to exit. Mar 21 13:39:23.078971 systemd-logind[1460]: Removed session 60. Mar 21 13:39:28.090224 systemd[1]: Started sshd@58-172.24.4.107:22-172.24.4.1:54802.service - OpenSSH per-connection server daemon (172.24.4.1:54802). Mar 21 13:39:29.463707 sshd[6330]: Accepted publickey for core from 172.24.4.1 port 54802 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:39:29.466344 sshd-session[6330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:39:29.478547 systemd-logind[1460]: New session 61 of user core. Mar 21 13:39:29.491731 systemd[1]: Started session-61.scope - Session 61 of User core. Mar 21 13:39:30.226460 sshd[6332]: Connection closed by 172.24.4.1 port 54802 Mar 21 13:39:30.227625 sshd-session[6330]: pam_unix(sshd:session): session closed for user core Mar 21 13:39:30.235062 systemd[1]: sshd@58-172.24.4.107:22-172.24.4.1:54802.service: Deactivated successfully. Mar 21 13:39:30.239767 systemd[1]: session-61.scope: Deactivated successfully. Mar 21 13:39:30.242127 systemd-logind[1460]: Session 61 logged out. Waiting for processes to exit. Mar 21 13:39:30.245593 systemd-logind[1460]: Removed session 61. Mar 21 13:39:35.250910 systemd[1]: Started sshd@59-172.24.4.107:22-172.24.4.1:53428.service - OpenSSH per-connection server daemon (172.24.4.1:53428). 
Mar 21 13:39:35.319179 containerd[1480]: time="2025-03-21T13:39:35.319134051Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"e6851349ed4590d877c67344a6ce93e7828aa629df7f8d295f2a06d04fc5d056\" pid:6358 exited_at:{seconds:1742564375 nanos:318880513}" Mar 21 13:39:36.062950 containerd[1480]: time="2025-03-21T13:39:36.062853277Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"70a3d456462dc3a5f6f3b3e5ee267b6ac46e852f685b1d8789fca1b3ca1e453f\" pid:6381 exited_at:{seconds:1742564376 nanos:62136867}" Mar 21 13:39:36.732715 sshd[6354]: Accepted publickey for core from 172.24.4.1 port 53428 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:39:36.736016 sshd-session[6354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:39:36.749863 systemd-logind[1460]: New session 62 of user core. Mar 21 13:39:36.758855 systemd[1]: Started session-62.scope - Session 62 of User core. Mar 21 13:39:37.489224 sshd[6394]: Connection closed by 172.24.4.1 port 53428 Mar 21 13:39:37.490301 sshd-session[6354]: pam_unix(sshd:session): session closed for user core Mar 21 13:39:37.498243 systemd[1]: sshd@59-172.24.4.107:22-172.24.4.1:53428.service: Deactivated successfully. Mar 21 13:39:37.503056 systemd[1]: session-62.scope: Deactivated successfully. Mar 21 13:39:37.505081 systemd-logind[1460]: Session 62 logged out. Waiting for processes to exit. Mar 21 13:39:37.507382 systemd-logind[1460]: Removed session 62. Mar 21 13:39:42.512550 systemd[1]: Started sshd@60-172.24.4.107:22-172.24.4.1:53430.service - OpenSSH per-connection server daemon (172.24.4.1:53430). 
Mar 21 13:39:43.847825 sshd[6408]: Accepted publickey for core from 172.24.4.1 port 53430 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:39:43.851001 sshd-session[6408]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:39:43.866761 systemd-logind[1460]: New session 63 of user core. Mar 21 13:39:43.876796 systemd[1]: Started session-63.scope - Session 63 of User core. Mar 21 13:39:44.687165 sshd[6410]: Connection closed by 172.24.4.1 port 53430 Mar 21 13:39:44.688253 sshd-session[6408]: pam_unix(sshd:session): session closed for user core Mar 21 13:39:44.692879 systemd[1]: sshd@60-172.24.4.107:22-172.24.4.1:53430.service: Deactivated successfully. Mar 21 13:39:44.694837 systemd[1]: session-63.scope: Deactivated successfully. Mar 21 13:39:44.697313 systemd-logind[1460]: Session 63 logged out. Waiting for processes to exit. Mar 21 13:39:44.699604 systemd-logind[1460]: Removed session 63. Mar 21 13:39:49.710987 systemd[1]: Started sshd@61-172.24.4.107:22-172.24.4.1:46914.service - OpenSSH per-connection server daemon (172.24.4.1:46914). Mar 21 13:39:50.821411 sshd[6421]: Accepted publickey for core from 172.24.4.1 port 46914 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:39:50.824144 sshd-session[6421]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:39:50.836300 systemd-logind[1460]: New session 64 of user core. Mar 21 13:39:50.843762 systemd[1]: Started session-64.scope - Session 64 of User core. Mar 21 13:39:51.562332 sshd[6423]: Connection closed by 172.24.4.1 port 46914 Mar 21 13:39:51.562903 sshd-session[6421]: pam_unix(sshd:session): session closed for user core Mar 21 13:39:51.566378 systemd[1]: sshd@61-172.24.4.107:22-172.24.4.1:46914.service: Deactivated successfully. Mar 21 13:39:51.568639 systemd[1]: session-64.scope: Deactivated successfully. Mar 21 13:39:51.570192 systemd-logind[1460]: Session 64 logged out. Waiting for processes to exit.
Mar 21 13:39:51.572621 systemd-logind[1460]: Removed session 64. Mar 21 13:39:53.326539 update_engine[1470]: I20250321 13:39:53.326375 1470 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Mar 21 13:39:53.326539 update_engine[1470]: I20250321 13:39:53.326536 1470 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Mar 21 13:39:53.327400 update_engine[1470]: I20250321 13:39:53.326853 1470 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Mar 21 13:39:53.327735 update_engine[1470]: I20250321 13:39:53.327646 1470 omaha_request_params.cc:62] Current group set to developer Mar 21 13:39:53.328598 update_engine[1470]: I20250321 13:39:53.328083 1470 update_attempter.cc:499] Already updated boot flags. Skipping. Mar 21 13:39:53.328598 update_engine[1470]: I20250321 13:39:53.328116 1470 update_attempter.cc:643] Scheduling an action processor start. Mar 21 13:39:53.328598 update_engine[1470]: I20250321 13:39:53.328145 1470 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Mar 21 13:39:53.328598 update_engine[1470]: I20250321 13:39:53.328194 1470 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Mar 21 13:39:53.328598 update_engine[1470]: I20250321 13:39:53.328299 1470 omaha_request_action.cc:271] Posting an Omaha request to disabled Mar 21 13:39:53.328598 update_engine[1470]: I20250321 13:39:53.328320 1470 omaha_request_action.cc:272] Request: Mar 21 13:39:53.328598 update_engine[1470]: Mar 21 13:39:53.328598 update_engine[1470]: Mar 21 13:39:53.328598 update_engine[1470]: Mar 21 13:39:53.328598 update_engine[1470]: Mar 21 13:39:53.328598 update_engine[1470]: Mar 21 13:39:53.328598 update_engine[1470]: Mar 21 13:39:53.328598 update_engine[1470]: Mar 21 13:39:53.328598 update_engine[1470]: Mar 21 13:39:53.328598 update_engine[1470]: I20250321 13:39:53.328331 1470 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 21 13:39:53.330409 locksmithd[1483]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Mar 21 13:39:53.337005 update_engine[1470]: I20250321 13:39:53.336903 1470 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 21 13:39:53.338003 update_engine[1470]: I20250321 13:39:53.337888 1470 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 21 13:39:53.345857 update_engine[1470]: E20250321 13:39:53.345760 1470 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 21 13:39:53.345992 update_engine[1470]: I20250321 13:39:53.345899 1470 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Mar 21 13:39:56.585884 systemd[1]: Started sshd@62-172.24.4.107:22-172.24.4.1:51970.service - OpenSSH per-connection server daemon (172.24.4.1:51970). Mar 21 13:39:57.627807 sshd[6438]: Accepted publickey for core from 172.24.4.1 port 51970 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:39:57.630896 sshd-session[6438]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:39:57.642590 systemd-logind[1460]: New session 65 of user core. Mar 21 13:39:57.648921 systemd[1]: Started session-65.scope - Session 65 of User core. Mar 21 13:39:58.241246 sshd[6440]: Connection closed by 172.24.4.1 port 51970 Mar 21 13:39:58.242325 sshd-session[6438]: pam_unix(sshd:session): session closed for user core Mar 21 13:39:58.248858 systemd[1]: sshd@62-172.24.4.107:22-172.24.4.1:51970.service: Deactivated successfully. Mar 21 13:39:58.255009 systemd[1]: session-65.scope: Deactivated successfully. Mar 21 13:39:58.259346 systemd-logind[1460]: Session 65 logged out. Waiting for processes to exit. Mar 21 13:39:58.261832 systemd-logind[1460]: Removed session 65.
Mar 21 13:40:03.266232 systemd[1]: Started sshd@63-172.24.4.107:22-172.24.4.1:51976.service - OpenSSH per-connection server daemon (172.24.4.1:51976). Mar 21 13:40:03.330540 update_engine[1470]: I20250321 13:40:03.330147 1470 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 21 13:40:03.331213 update_engine[1470]: I20250321 13:40:03.330557 1470 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 21 13:40:03.331213 update_engine[1470]: I20250321 13:40:03.330973 1470 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 21 13:40:03.336324 update_engine[1470]: E20250321 13:40:03.336253 1470 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 21 13:40:03.336493 update_engine[1470]: I20250321 13:40:03.336371 1470 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Mar 21 13:40:04.518571 sshd[6452]: Accepted publickey for core from 172.24.4.1 port 51976 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:40:04.521646 sshd-session[6452]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:40:04.532751 systemd-logind[1460]: New session 66 of user core. Mar 21 13:40:04.543743 systemd[1]: Started session-66.scope - Session 66 of User core. Mar 21 13:40:05.298373 containerd[1480]: time="2025-03-21T13:40:05.298330240Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"5b23e7556078c1604a5d64ac650778df7723b595d0a094fd16dafc58977abaca\" pid:6475 exited_at:{seconds:1742564405 nanos:298141656}" Mar 21 13:40:05.395821 sshd[6454]: Connection closed by 172.24.4.1 port 51976 Mar 21 13:40:05.396196 sshd-session[6452]: pam_unix(sshd:session): session closed for user core Mar 21 13:40:05.401564 systemd[1]: sshd@63-172.24.4.107:22-172.24.4.1:51976.service: Deactivated successfully. Mar 21 13:40:05.405170 systemd[1]: session-66.scope: Deactivated successfully. 
Mar 21 13:40:05.406697 systemd-logind[1460]: Session 66 logged out. Waiting for processes to exit. Mar 21 13:40:05.408013 systemd-logind[1460]: Removed session 66. Mar 21 13:40:06.072363 containerd[1480]: time="2025-03-21T13:40:06.072310346Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"152f8e4922d874b0e9200680acb13d5e01175fef91ce23567755e59c1fc32819\" pid:6501 exited_at:{seconds:1742564406 nanos:71252022}" Mar 21 13:40:07.850862 containerd[1480]: time="2025-03-21T13:40:07.850802465Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"c52d92c6d3ea17902cfe57260a1420519a58527abdc2f76e4696e8eb7bddf9ea\" pid:6525 exited_at:{seconds:1742564407 nanos:850327750}" Mar 21 13:40:10.421076 systemd[1]: Started sshd@64-172.24.4.107:22-172.24.4.1:60898.service - OpenSSH per-connection server daemon (172.24.4.1:60898). Mar 21 13:40:11.592864 sshd[6535]: Accepted publickey for core from 172.24.4.1 port 60898 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:40:11.595669 sshd-session[6535]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:40:11.608247 systemd-logind[1460]: New session 67 of user core. Mar 21 13:40:11.613829 systemd[1]: Started session-67.scope - Session 67 of User core. Mar 21 13:40:12.340694 sshd[6539]: Connection closed by 172.24.4.1 port 60898 Mar 21 13:40:12.341858 sshd-session[6535]: pam_unix(sshd:session): session closed for user core Mar 21 13:40:12.347238 systemd[1]: sshd@64-172.24.4.107:22-172.24.4.1:60898.service: Deactivated successfully. Mar 21 13:40:12.351582 systemd[1]: session-67.scope: Deactivated successfully. Mar 21 13:40:12.355136 systemd-logind[1460]: Session 67 logged out. Waiting for processes to exit. Mar 21 13:40:12.358735 systemd-logind[1460]: Removed session 67. 
Mar 21 13:40:13.326392 update_engine[1470]: I20250321 13:40:13.326035 1470 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 21 13:40:13.327610 update_engine[1470]: I20250321 13:40:13.327534 1470 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 21 13:40:13.328642 update_engine[1470]: I20250321 13:40:13.328480 1470 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 21 13:40:13.333919 update_engine[1470]: E20250321 13:40:13.333817 1470 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 21 13:40:13.334082 update_engine[1470]: I20250321 13:40:13.333997 1470 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Mar 21 13:40:17.366477 systemd[1]: Started sshd@65-172.24.4.107:22-172.24.4.1:57780.service - OpenSSH per-connection server daemon (172.24.4.1:57780). Mar 21 13:40:18.488182 sshd[6551]: Accepted publickey for core from 172.24.4.1 port 57780 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI Mar 21 13:40:18.491112 sshd-session[6551]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 21 13:40:18.503561 systemd-logind[1460]: New session 68 of user core. Mar 21 13:40:18.507756 systemd[1]: Started session-68.scope - Session 68 of User core. Mar 21 13:40:19.273252 sshd[6553]: Connection closed by 172.24.4.1 port 57780 Mar 21 13:40:19.273782 sshd-session[6551]: pam_unix(sshd:session): session closed for user core Mar 21 13:40:19.282934 systemd[1]: sshd@65-172.24.4.107:22-172.24.4.1:57780.service: Deactivated successfully. Mar 21 13:40:19.287854 systemd[1]: session-68.scope: Deactivated successfully. Mar 21 13:40:19.289715 systemd-logind[1460]: Session 68 logged out. Waiting for processes to exit. Mar 21 13:40:19.291801 systemd-logind[1460]: Removed session 68. 
Mar 21 13:40:23.329294 update_engine[1470]: I20250321 13:40:23.329157 1470 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Mar 21 13:40:23.330175 update_engine[1470]: I20250321 13:40:23.329670 1470 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Mar 21 13:40:23.330175 update_engine[1470]: I20250321 13:40:23.330123 1470 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Mar 21 13:40:23.335607 update_engine[1470]: E20250321 13:40:23.335532 1470 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Mar 21 13:40:23.335763 update_engine[1470]: I20250321 13:40:23.335657 1470 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Mar 21 13:40:23.335763 update_engine[1470]: I20250321 13:40:23.335682 1470 omaha_request_action.cc:617] Omaha request response: Mar 21 13:40:23.335992 update_engine[1470]: E20250321 13:40:23.335813 1470 omaha_request_action.cc:636] Omaha request network transfer failed. Mar 21 13:40:23.335992 update_engine[1470]: I20250321 13:40:23.335850 1470 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Mar 21 13:40:23.335992 update_engine[1470]: I20250321 13:40:23.335862 1470 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Mar 21 13:40:23.335992 update_engine[1470]: I20250321 13:40:23.335873 1470 update_attempter.cc:306] Processing Done. Mar 21 13:40:23.335992 update_engine[1470]: E20250321 13:40:23.335896 1470 update_attempter.cc:619] Update failed. 
Mar 21 13:40:23.335992 update_engine[1470]: I20250321 13:40:23.335909  1470 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Mar 21 13:40:23.335992 update_engine[1470]: I20250321 13:40:23.335920  1470 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Mar 21 13:40:23.335992 update_engine[1470]: I20250321 13:40:23.335932  1470 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Mar 21 13:40:23.336915 update_engine[1470]: I20250321 13:40:23.336072  1470 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 21 13:40:23.336915 update_engine[1470]: I20250321 13:40:23.336117  1470 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 21 13:40:23.336915 update_engine[1470]: I20250321 13:40:23.336129  1470 omaha_request_action.cc:272] Request:
Mar 21 13:40:23.336915 update_engine[1470]:
Mar 21 13:40:23.336915 update_engine[1470]:
Mar 21 13:40:23.336915 update_engine[1470]:
Mar 21 13:40:23.336915 update_engine[1470]:
Mar 21 13:40:23.336915 update_engine[1470]:
Mar 21 13:40:23.336915 update_engine[1470]:
Mar 21 13:40:23.336915 update_engine[1470]: I20250321 13:40:23.336143  1470 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 21 13:40:23.336915 update_engine[1470]: I20250321 13:40:23.336421  1470 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 21 13:40:23.337706 update_engine[1470]: I20250321 13:40:23.336982  1470 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 21 13:40:23.337796 locksmithd[1483]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Mar 21 13:40:23.342478 update_engine[1470]: E20250321 13:40:23.342362  1470 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 21 13:40:23.342581 update_engine[1470]: I20250321 13:40:23.342531  1470 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 21 13:40:23.342581 update_engine[1470]: I20250321 13:40:23.342557  1470 omaha_request_action.cc:617] Omaha request response:
Mar 21 13:40:23.342581 update_engine[1470]: I20250321 13:40:23.342571  1470 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 21 13:40:23.343094 update_engine[1470]: I20250321 13:40:23.342582  1470 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 21 13:40:23.343094 update_engine[1470]: I20250321 13:40:23.342594  1470 update_attempter.cc:306] Processing Done.
Mar 21 13:40:23.343094 update_engine[1470]: I20250321 13:40:23.342607  1470 update_attempter.cc:310] Error event sent.
Mar 21 13:40:23.343094 update_engine[1470]: I20250321 13:40:23.342624  1470 update_check_scheduler.cc:74] Next update check in 46m58s
Mar 21 13:40:23.343629 locksmithd[1483]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Mar 21 13:40:24.299042 systemd[1]: Started sshd@66-172.24.4.107:22-172.24.4.1:43318.service - OpenSSH per-connection server daemon (172.24.4.1:43318).
Mar 21 13:40:25.678337 sshd[6564]: Accepted publickey for core from 172.24.4.1 port 43318 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:40:25.681544 sshd-session[6564]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:40:25.693032 systemd-logind[1460]: New session 69 of user core.
Mar 21 13:40:25.703774 systemd[1]: Started session-69.scope - Session 69 of User core.
Mar 21 13:40:26.425529 sshd[6566]: Connection closed by 172.24.4.1 port 43318
Mar 21 13:40:26.426749 sshd-session[6564]: pam_unix(sshd:session): session closed for user core
Mar 21 13:40:26.433681 systemd[1]: sshd@66-172.24.4.107:22-172.24.4.1:43318.service: Deactivated successfully.
Mar 21 13:40:26.438723 systemd[1]: session-69.scope: Deactivated successfully.
Mar 21 13:40:26.441163 systemd-logind[1460]: Session 69 logged out. Waiting for processes to exit.
Mar 21 13:40:26.444430 systemd-logind[1460]: Removed session 69.
Mar 21 13:40:31.455977 systemd[1]: Started sshd@67-172.24.4.107:22-172.24.4.1:43320.service - OpenSSH per-connection server daemon (172.24.4.1:43320).
Mar 21 13:40:32.612798 sshd[6578]: Accepted publickey for core from 172.24.4.1 port 43320 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:40:32.615551 sshd-session[6578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:40:32.628642 systemd-logind[1460]: New session 70 of user core.
Mar 21 13:40:32.634766 systemd[1]: Started session-70.scope - Session 70 of User core.
Mar 21 13:40:33.491066 sshd[6580]: Connection closed by 172.24.4.1 port 43320
Mar 21 13:40:33.493208 sshd-session[6578]: pam_unix(sshd:session): session closed for user core
Mar 21 13:40:33.500295 systemd[1]: sshd@67-172.24.4.107:22-172.24.4.1:43320.service: Deactivated successfully.
Mar 21 13:40:33.504193 systemd[1]: session-70.scope: Deactivated successfully.
Mar 21 13:40:33.506795 systemd-logind[1460]: Session 70 logged out. Waiting for processes to exit.
Mar 21 13:40:33.510422 systemd-logind[1460]: Removed session 70.
Mar 21 13:40:35.316043 containerd[1480]: time="2025-03-21T13:40:35.315979215Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"210f181c52baa91d4d7bd19b685620aae8dd5b7c0515f32f79e7b6edaad2cb4b\" pid:6615 exited_at:{seconds:1742564435 nanos:315605501}"
Mar 21 13:40:36.074639 containerd[1480]: time="2025-03-21T13:40:36.074572313Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"df7893037b25328571c175be21c37af6aa823fe588582319891837e4f95d95fd\" pid:6637 exited_at:{seconds:1742564436 nanos:73286141}"
Mar 21 13:40:38.513431 systemd[1]: Started sshd@68-172.24.4.107:22-172.24.4.1:35444.service - OpenSSH per-connection server daemon (172.24.4.1:35444).
Mar 21 13:40:39.741069 sshd[6655]: Accepted publickey for core from 172.24.4.1 port 35444 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:40:39.744153 sshd-session[6655]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:40:39.755810 systemd-logind[1460]: New session 71 of user core.
Mar 21 13:40:39.766810 systemd[1]: Started session-71.scope - Session 71 of User core.
Mar 21 13:40:40.447546 sshd[6657]: Connection closed by 172.24.4.1 port 35444
Mar 21 13:40:40.448616 sshd-session[6655]: pam_unix(sshd:session): session closed for user core
Mar 21 13:40:40.454642 systemd[1]: sshd@68-172.24.4.107:22-172.24.4.1:35444.service: Deactivated successfully.
Mar 21 13:40:40.458880 systemd[1]: session-71.scope: Deactivated successfully.
Mar 21 13:40:40.463895 systemd-logind[1460]: Session 71 logged out. Waiting for processes to exit.
Mar 21 13:40:40.466352 systemd-logind[1460]: Removed session 71.
Mar 21 13:40:45.471588 systemd[1]: Started sshd@69-172.24.4.107:22-172.24.4.1:52540.service - OpenSSH per-connection server daemon (172.24.4.1:52540).
Mar 21 13:40:46.776681 sshd[6670]: Accepted publickey for core from 172.24.4.1 port 52540 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:40:46.780971 sshd-session[6670]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:40:46.795571 systemd-logind[1460]: New session 72 of user core.
Mar 21 13:40:46.811814 systemd[1]: Started session-72.scope - Session 72 of User core.
Mar 21 13:40:47.523633 sshd[6673]: Connection closed by 172.24.4.1 port 52540
Mar 21 13:40:47.524211 sshd-session[6670]: pam_unix(sshd:session): session closed for user core
Mar 21 13:40:47.528561 systemd-logind[1460]: Session 72 logged out. Waiting for processes to exit.
Mar 21 13:40:47.528949 systemd[1]: sshd@69-172.24.4.107:22-172.24.4.1:52540.service: Deactivated successfully.
Mar 21 13:40:47.531206 systemd[1]: session-72.scope: Deactivated successfully.
Mar 21 13:40:47.533135 systemd-logind[1460]: Removed session 72.
Mar 21 13:40:52.544557 systemd[1]: Started sshd@70-172.24.4.107:22-172.24.4.1:52544.service - OpenSSH per-connection server daemon (172.24.4.1:52544).
Mar 21 13:40:53.752587 sshd[6685]: Accepted publickey for core from 172.24.4.1 port 52544 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:40:53.755625 sshd-session[6685]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:40:53.768121 systemd-logind[1460]: New session 73 of user core.
Mar 21 13:40:53.773195 systemd[1]: Started session-73.scope - Session 73 of User core.
Mar 21 13:40:54.523680 sshd[6687]: Connection closed by 172.24.4.1 port 52544
Mar 21 13:40:54.525050 sshd-session[6685]: pam_unix(sshd:session): session closed for user core
Mar 21 13:40:54.534268 systemd[1]: sshd@70-172.24.4.107:22-172.24.4.1:52544.service: Deactivated successfully.
Mar 21 13:40:54.539887 systemd[1]: session-73.scope: Deactivated successfully.
Mar 21 13:40:54.542242 systemd-logind[1460]: Session 73 logged out. Waiting for processes to exit.
Mar 21 13:40:54.544681 systemd-logind[1460]: Removed session 73.
Mar 21 13:40:59.543013 systemd[1]: Started sshd@71-172.24.4.107:22-172.24.4.1:54742.service - OpenSSH per-connection server daemon (172.24.4.1:54742).
Mar 21 13:41:00.888298 sshd[6701]: Accepted publickey for core from 172.24.4.1 port 54742 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:41:00.891689 sshd-session[6701]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:41:00.903900 systemd-logind[1460]: New session 74 of user core.
Mar 21 13:41:00.909723 systemd[1]: Started session-74.scope - Session 74 of User core.
Mar 21 13:41:01.770936 sshd[6703]: Connection closed by 172.24.4.1 port 54742
Mar 21 13:41:01.772059 sshd-session[6701]: pam_unix(sshd:session): session closed for user core
Mar 21 13:41:01.776987 systemd[1]: sshd@71-172.24.4.107:22-172.24.4.1:54742.service: Deactivated successfully.
Mar 21 13:41:01.781953 systemd[1]: session-74.scope: Deactivated successfully.
Mar 21 13:41:01.785151 systemd-logind[1460]: Session 74 logged out. Waiting for processes to exit.
Mar 21 13:41:01.787301 systemd-logind[1460]: Removed session 74.
Mar 21 13:41:05.315511 containerd[1480]: time="2025-03-21T13:41:05.315402951Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"dbc13067713e46bebff1ee3920a66b194d312991ee41fd65c089babe8398f98c\" pid:6728 exited_at:{seconds:1742564465 nanos:315144274}"
Mar 21 13:41:06.078767 containerd[1480]: time="2025-03-21T13:41:06.078702033Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"54b8217afc7b2fd0e51237f359af654fb156844ae99470a14388fd207a26bd29\" pid:6748 exited_at:{seconds:1742564466 nanos:77241744}"
Mar 21 13:41:06.792639 systemd[1]: Started sshd@72-172.24.4.107:22-172.24.4.1:37374.service - OpenSSH per-connection server daemon (172.24.4.1:37374).
Mar 21 13:41:07.868511 containerd[1480]: time="2025-03-21T13:41:07.868360917Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"2f944cc16accdbb4b60ef65af052e18dcc4dc5fbb4981e0074c46f3cc7987187\" pid:6782 exited_at:{seconds:1742564467 nanos:868171400}"
Mar 21 13:41:07.922941 sshd[6762]: Accepted publickey for core from 172.24.4.1 port 37374 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:41:07.924720 sshd-session[6762]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:41:07.931562 systemd-logind[1460]: New session 75 of user core.
Mar 21 13:41:07.934588 systemd[1]: Started session-75.scope - Session 75 of User core.
Mar 21 13:41:08.631676 sshd[6791]: Connection closed by 172.24.4.1 port 37374
Mar 21 13:41:08.632699 sshd-session[6762]: pam_unix(sshd:session): session closed for user core
Mar 21 13:41:08.638226 systemd[1]: sshd@72-172.24.4.107:22-172.24.4.1:37374.service: Deactivated successfully.
Mar 21 13:41:08.642914 systemd[1]: session-75.scope: Deactivated successfully.
Mar 21 13:41:08.646962 systemd-logind[1460]: Session 75 logged out. Waiting for processes to exit.
Mar 21 13:41:08.649703 systemd-logind[1460]: Removed session 75.
Mar 21 13:41:13.654689 systemd[1]: Started sshd@73-172.24.4.107:22-172.24.4.1:42736.service - OpenSSH per-connection server daemon (172.24.4.1:42736).
Mar 21 13:41:15.104669 sshd[6806]: Accepted publickey for core from 172.24.4.1 port 42736 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:41:15.106751 sshd-session[6806]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:41:15.118370 systemd-logind[1460]: New session 76 of user core.
Mar 21 13:41:15.125758 systemd[1]: Started session-76.scope - Session 76 of User core.
Mar 21 13:41:16.064134 sshd[6808]: Connection closed by 172.24.4.1 port 42736
Mar 21 13:41:16.064465 sshd-session[6806]: pam_unix(sshd:session): session closed for user core
Mar 21 13:41:16.068031 systemd-logind[1460]: Session 76 logged out. Waiting for processes to exit.
Mar 21 13:41:16.068651 systemd[1]: sshd@73-172.24.4.107:22-172.24.4.1:42736.service: Deactivated successfully.
Mar 21 13:41:16.070764 systemd[1]: session-76.scope: Deactivated successfully.
Mar 21 13:41:16.072703 systemd-logind[1460]: Removed session 76.
Mar 21 13:41:21.086688 systemd[1]: Started sshd@74-172.24.4.107:22-172.24.4.1:42748.service - OpenSSH per-connection server daemon (172.24.4.1:42748).
Mar 21 13:41:22.453695 sshd[6819]: Accepted publickey for core from 172.24.4.1 port 42748 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:41:22.456609 sshd-session[6819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:41:22.469550 systemd-logind[1460]: New session 77 of user core.
Mar 21 13:41:22.480583 systemd[1]: Started session-77.scope - Session 77 of User core.
Mar 21 13:41:23.336111 sshd[6821]: Connection closed by 172.24.4.1 port 42748
Mar 21 13:41:23.336751 sshd-session[6819]: pam_unix(sshd:session): session closed for user core
Mar 21 13:41:23.341316 systemd[1]: sshd@74-172.24.4.107:22-172.24.4.1:42748.service: Deactivated successfully.
Mar 21 13:41:23.345808 systemd[1]: session-77.scope: Deactivated successfully.
Mar 21 13:41:23.347293 systemd-logind[1460]: Session 77 logged out. Waiting for processes to exit.
Mar 21 13:41:23.349218 systemd-logind[1460]: Removed session 77.
Mar 21 13:41:28.358037 systemd[1]: Started sshd@75-172.24.4.107:22-172.24.4.1:55092.service - OpenSSH per-connection server daemon (172.24.4.1:55092).
Mar 21 13:41:29.933125 sshd[6833]: Accepted publickey for core from 172.24.4.1 port 55092 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:41:29.937335 sshd-session[6833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:41:29.949573 systemd-logind[1460]: New session 78 of user core.
Mar 21 13:41:29.958756 systemd[1]: Started session-78.scope - Session 78 of User core.
Mar 21 13:41:30.711751 sshd[6835]: Connection closed by 172.24.4.1 port 55092
Mar 21 13:41:30.712335 sshd-session[6833]: pam_unix(sshd:session): session closed for user core
Mar 21 13:41:30.716326 systemd[1]: sshd@75-172.24.4.107:22-172.24.4.1:55092.service: Deactivated successfully.
Mar 21 13:41:30.718594 systemd[1]: session-78.scope: Deactivated successfully.
Mar 21 13:41:30.720704 systemd-logind[1460]: Session 78 logged out. Waiting for processes to exit.
Mar 21 13:41:30.722171 systemd-logind[1460]: Removed session 78.
Mar 21 13:41:35.304007 containerd[1480]: time="2025-03-21T13:41:35.303919633Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"96c33c33a07c82c3c640079e3ae322db7ef119b1c9eb2eea5081cfdcd8fad235\" pid:6857 exited_at:{seconds:1742564495 nanos:303681275}"
Mar 21 13:41:35.732577 systemd[1]: Started sshd@76-172.24.4.107:22-172.24.4.1:41552.service - OpenSSH per-connection server daemon (172.24.4.1:41552).
Mar 21 13:41:36.074794 containerd[1480]: time="2025-03-21T13:41:36.074694656Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"aab1f19b282ecaa989242ccf74ec60a47bccc635169c18851dff8151bfc52218\" pid:6881 exited_at:{seconds:1742564496 nanos:74326273}"
Mar 21 13:41:36.990283 sshd[6867]: Accepted publickey for core from 172.24.4.1 port 41552 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:41:36.993284 sshd-session[6867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:41:37.005931 systemd-logind[1460]: New session 79 of user core.
Mar 21 13:41:37.017719 systemd[1]: Started session-79.scope - Session 79 of User core.
Mar 21 13:41:37.698031 sshd[6893]: Connection closed by 172.24.4.1 port 41552
Mar 21 13:41:37.697600 sshd-session[6867]: pam_unix(sshd:session): session closed for user core
Mar 21 13:41:37.702039 systemd[1]: sshd@76-172.24.4.107:22-172.24.4.1:41552.service: Deactivated successfully.
Mar 21 13:41:37.703750 systemd[1]: session-79.scope: Deactivated successfully.
Mar 21 13:41:37.705981 systemd-logind[1460]: Session 79 logged out. Waiting for processes to exit.
Mar 21 13:41:37.707400 systemd-logind[1460]: Removed session 79.
Mar 21 13:41:42.720145 systemd[1]: Started sshd@77-172.24.4.107:22-172.24.4.1:41558.service - OpenSSH per-connection server daemon (172.24.4.1:41558).
Mar 21 13:41:43.914537 sshd[6907]: Accepted publickey for core from 172.24.4.1 port 41558 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:41:43.917370 sshd-session[6907]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:41:43.929555 systemd-logind[1460]: New session 80 of user core.
Mar 21 13:41:43.937737 systemd[1]: Started session-80.scope - Session 80 of User core.
Mar 21 13:41:44.699522 sshd[6909]: Connection closed by 172.24.4.1 port 41558
Mar 21 13:41:44.700630 sshd-session[6907]: pam_unix(sshd:session): session closed for user core
Mar 21 13:41:44.707584 systemd[1]: sshd@77-172.24.4.107:22-172.24.4.1:41558.service: Deactivated successfully.
Mar 21 13:41:44.711242 systemd[1]: session-80.scope: Deactivated successfully.
Mar 21 13:41:44.713976 systemd-logind[1460]: Session 80 logged out. Waiting for processes to exit.
Mar 21 13:41:44.716346 systemd-logind[1460]: Removed session 80.
Mar 21 13:41:49.722870 systemd[1]: Started sshd@78-172.24.4.107:22-172.24.4.1:44438.service - OpenSSH per-connection server daemon (172.24.4.1:44438).
Mar 21 13:41:51.015232 sshd[6921]: Accepted publickey for core from 172.24.4.1 port 44438 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:41:51.018052 sshd-session[6921]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:41:51.029745 systemd-logind[1460]: New session 81 of user core.
Mar 21 13:41:51.036756 systemd[1]: Started session-81.scope - Session 81 of User core.
Mar 21 13:41:51.721687 sshd[6923]: Connection closed by 172.24.4.1 port 44438
Mar 21 13:41:51.722214 sshd-session[6921]: pam_unix(sshd:session): session closed for user core
Mar 21 13:41:51.726510 systemd[1]: sshd@78-172.24.4.107:22-172.24.4.1:44438.service: Deactivated successfully.
Mar 21 13:41:51.729577 systemd[1]: session-81.scope: Deactivated successfully.
Mar 21 13:41:51.731345 systemd-logind[1460]: Session 81 logged out. Waiting for processes to exit.
Mar 21 13:41:51.733151 systemd-logind[1460]: Removed session 81.
Mar 21 13:41:56.743591 systemd[1]: Started sshd@79-172.24.4.107:22-172.24.4.1:39318.service - OpenSSH per-connection server daemon (172.24.4.1:39318).
Mar 21 13:41:57.912980 sshd[6937]: Accepted publickey for core from 172.24.4.1 port 39318 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:41:57.915764 sshd-session[6937]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:41:57.926082 systemd-logind[1460]: New session 82 of user core.
Mar 21 13:41:57.934763 systemd[1]: Started session-82.scope - Session 82 of User core.
Mar 21 13:41:58.490406 sshd[6939]: Connection closed by 172.24.4.1 port 39318
Mar 21 13:41:58.491284 sshd-session[6937]: pam_unix(sshd:session): session closed for user core
Mar 21 13:41:58.495255 systemd[1]: sshd@79-172.24.4.107:22-172.24.4.1:39318.service: Deactivated successfully.
Mar 21 13:41:58.497267 systemd[1]: session-82.scope: Deactivated successfully.
Mar 21 13:41:58.498244 systemd-logind[1460]: Session 82 logged out. Waiting for processes to exit.
Mar 21 13:41:58.499257 systemd-logind[1460]: Removed session 82.
Mar 21 13:42:03.512949 systemd[1]: Started sshd@80-172.24.4.107:22-172.24.4.1:35506.service - OpenSSH per-connection server daemon (172.24.4.1:35506).
Mar 21 13:42:04.721122 sshd[6951]: Accepted publickey for core from 172.24.4.1 port 35506 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:42:04.723847 sshd-session[6951]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:42:04.735562 systemd-logind[1460]: New session 83 of user core.
Mar 21 13:42:04.745732 systemd[1]: Started session-83.scope - Session 83 of User core.
Mar 21 13:42:05.299210 containerd[1480]: time="2025-03-21T13:42:05.299141073Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"d8ec4ba182cb36c0e71ecb355a8887b955b483f20ee729bf245df7c7a3fcf683\" pid:6974 exited_at:{seconds:1742564525 nanos:298695586}"
Mar 21 13:42:05.419011 sshd[6953]: Connection closed by 172.24.4.1 port 35506
Mar 21 13:42:05.419540 sshd-session[6951]: pam_unix(sshd:session): session closed for user core
Mar 21 13:42:05.423343 systemd[1]: sshd@80-172.24.4.107:22-172.24.4.1:35506.service: Deactivated successfully.
Mar 21 13:42:05.425744 systemd[1]: session-83.scope: Deactivated successfully.
Mar 21 13:42:05.429507 systemd-logind[1460]: Session 83 logged out. Waiting for processes to exit.
Mar 21 13:42:05.431813 systemd-logind[1460]: Removed session 83.
Mar 21 13:42:06.027559 containerd[1480]: time="2025-03-21T13:42:06.027414901Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"02f69ff4cd3a5d36e6ef74213c333c20be5697a46ac8059aa447c7b1a4a5af30\" pid:6999 exited_at:{seconds:1742564526 nanos:26920000}"
Mar 21 13:42:07.856798 containerd[1480]: time="2025-03-21T13:42:07.856696239Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"1f6d3fe272b4b92c80f35bac0f02b9ec0d84983f32c0878a005db845d5f4302c\" pid:7023 exited_at:{seconds:1742564527 nanos:855647055}"
Mar 21 13:42:10.443081 systemd[1]: Started sshd@81-172.24.4.107:22-172.24.4.1:35520.service - OpenSSH per-connection server daemon (172.24.4.1:35520).
Mar 21 13:42:11.552974 sshd[7033]: Accepted publickey for core from 172.24.4.1 port 35520 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:42:11.556496 sshd-session[7033]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:42:11.570551 systemd-logind[1460]: New session 84 of user core.
Mar 21 13:42:11.576868 systemd[1]: Started session-84.scope - Session 84 of User core.
Mar 21 13:42:12.259063 sshd[7037]: Connection closed by 172.24.4.1 port 35520
Mar 21 13:42:12.260124 sshd-session[7033]: pam_unix(sshd:session): session closed for user core
Mar 21 13:42:12.266099 systemd[1]: sshd@81-172.24.4.107:22-172.24.4.1:35520.service: Deactivated successfully.
Mar 21 13:42:12.270293 systemd[1]: session-84.scope: Deactivated successfully.
Mar 21 13:42:12.272012 systemd-logind[1460]: Session 84 logged out. Waiting for processes to exit.
Mar 21 13:42:12.274200 systemd-logind[1460]: Removed session 84.
Mar 21 13:42:17.284171 systemd[1]: Started sshd@82-172.24.4.107:22-172.24.4.1:49846.service - OpenSSH per-connection server daemon (172.24.4.1:49846).
Mar 21 13:42:18.545556 sshd[7065]: Accepted publickey for core from 172.24.4.1 port 49846 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:42:18.548383 sshd-session[7065]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:42:18.561105 systemd-logind[1460]: New session 85 of user core.
Mar 21 13:42:18.565723 systemd[1]: Started session-85.scope - Session 85 of User core.
Mar 21 13:42:19.306084 sshd[7067]: Connection closed by 172.24.4.1 port 49846
Mar 21 13:42:19.308147 sshd-session[7065]: pam_unix(sshd:session): session closed for user core
Mar 21 13:42:19.314232 systemd[1]: sshd@82-172.24.4.107:22-172.24.4.1:49846.service: Deactivated successfully.
Mar 21 13:42:19.319253 systemd[1]: session-85.scope: Deactivated successfully.
Mar 21 13:42:19.328326 systemd-logind[1460]: Session 85 logged out. Waiting for processes to exit.
Mar 21 13:42:19.331129 systemd-logind[1460]: Removed session 85.
Mar 21 13:42:24.326975 systemd[1]: Started sshd@83-172.24.4.107:22-172.24.4.1:59318.service - OpenSSH per-connection server daemon (172.24.4.1:59318).
Mar 21 13:42:25.482925 sshd[7079]: Accepted publickey for core from 172.24.4.1 port 59318 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:42:25.486121 sshd-session[7079]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:42:25.496854 systemd-logind[1460]: New session 86 of user core.
Mar 21 13:42:25.503765 systemd[1]: Started session-86.scope - Session 86 of User core.
Mar 21 13:42:26.401676 sshd[7081]: Connection closed by 172.24.4.1 port 59318
Mar 21 13:42:26.402904 sshd-session[7079]: pam_unix(sshd:session): session closed for user core
Mar 21 13:42:26.412960 systemd-logind[1460]: Session 86 logged out. Waiting for processes to exit.
Mar 21 13:42:26.414341 systemd[1]: sshd@83-172.24.4.107:22-172.24.4.1:59318.service: Deactivated successfully.
Mar 21 13:42:26.421560 systemd[1]: session-86.scope: Deactivated successfully.
Mar 21 13:42:26.427876 systemd-logind[1460]: Removed session 86.
Mar 21 13:42:31.419903 systemd[1]: Started sshd@84-172.24.4.107:22-172.24.4.1:59330.service - OpenSSH per-connection server daemon (172.24.4.1:59330).
Mar 21 13:42:32.626536 sshd[7093]: Accepted publickey for core from 172.24.4.1 port 59330 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:42:32.629424 sshd-session[7093]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:42:32.642128 systemd-logind[1460]: New session 87 of user core.
Mar 21 13:42:32.649747 systemd[1]: Started session-87.scope - Session 87 of User core.
Mar 21 13:42:33.605540 sshd[7095]: Connection closed by 172.24.4.1 port 59330
Mar 21 13:42:33.606626 sshd-session[7093]: pam_unix(sshd:session): session closed for user core
Mar 21 13:42:33.614190 systemd-logind[1460]: Session 87 logged out. Waiting for processes to exit.
Mar 21 13:42:33.616407 systemd[1]: sshd@84-172.24.4.107:22-172.24.4.1:59330.service: Deactivated successfully.
Mar 21 13:42:33.620868 systemd[1]: session-87.scope: Deactivated successfully.
Mar 21 13:42:33.624400 systemd-logind[1460]: Removed session 87.
Mar 21 13:42:35.308411 containerd[1480]: time="2025-03-21T13:42:35.308369076Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a06d8316721ae2188f6e781234a25004bcbb98152a6449e258eb8b5136aa24b\" id:\"c876d10cff475f8bfe4d124883381fa963bae476334680d26fa06218014a5722\" pid:7119 exited_at:{seconds:1742564555 nanos:307960196}"
Mar 21 13:42:36.075687 containerd[1480]: time="2025-03-21T13:42:36.075644845Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d54a485f491dc391b09326744310fbfd6e67d26e2bff0c28d8f9f04aa0e655a\" id:\"71b56e64ef43744b9094ad836cbe4c5a99098f88907541f9d6b8dac4b30ab61f\" pid:7140 exited_at:{seconds:1742564556 nanos:75164381}"
Mar 21 13:42:38.630167 systemd[1]: Started sshd@85-172.24.4.107:22-172.24.4.1:49180.service - OpenSSH per-connection server daemon (172.24.4.1:49180).
Mar 21 13:42:39.739855 sshd[7153]: Accepted publickey for core from 172.24.4.1 port 49180 ssh2: RSA SHA256:ANmj2OjS2Xp1ZpeGOmqKkJesIrogOd1e6RUrzk4ButI
Mar 21 13:42:39.742654 sshd-session[7153]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 21 13:42:39.753371 systemd-logind[1460]: New session 88 of user core.
Mar 21 13:42:39.760712 systemd[1]: Started session-88.scope - Session 88 of User core.
Mar 21 13:42:40.493585 sshd[7155]: Connection closed by 172.24.4.1 port 49180
Mar 21 13:42:40.494360 sshd-session[7153]: pam_unix(sshd:session): session closed for user core
Mar 21 13:42:40.504049 systemd[1]: sshd@85-172.24.4.107:22-172.24.4.1:49180.service: Deactivated successfully.
Mar 21 13:42:40.509386 systemd[1]: session-88.scope: Deactivated successfully.
Mar 21 13:42:40.511668 systemd-logind[1460]: Session 88 logged out. Waiting for processes to exit.
Mar 21 13:42:40.513867 systemd-logind[1460]: Removed session 88.