Mar 25 01:35:13.102188 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Mar 24 23:38:35 -00 2025
Mar 25 01:35:13.102256 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 01:35:13.102273 kernel: BIOS-provided physical RAM map:
Mar 25 01:35:13.102284 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Mar 25 01:35:13.102292 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved
Mar 25 01:35:13.102302 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003ff40fff] usable
Mar 25 01:35:13.102314 kernel: BIOS-e820: [mem 0x000000003ff41000-0x000000003ffc8fff] reserved
Mar 25 01:35:13.102325 kernel: BIOS-e820: [mem 0x000000003ffc9000-0x000000003fffafff] ACPI data
Mar 25 01:35:13.102338 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS
Mar 25 01:35:13.102349 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable
Mar 25 01:35:13.102360 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable
Mar 25 01:35:13.102370 kernel: printk: bootconsole [earlyser0] enabled
Mar 25 01:35:13.102380 kernel: NX (Execute Disable) protection: active
Mar 25 01:35:13.102392 kernel: APIC: Static calls initialized
Mar 25 01:35:13.102407 kernel: efi: EFI v2.7 by Microsoft
Mar 25 01:35:13.102420 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff85000 SMBIOS 3.0=0x3ff83000 MEMATTR=0x3ee83a98 RNG=0x3ffd1018
Mar 25 01:35:13.102432 kernel: random: crng init done
Mar 25 01:35:13.102444 kernel: secureboot: Secure boot disabled
Mar 25 01:35:13.102455 kernel: SMBIOS 3.1.0 present.
Mar 25 01:35:13.102467 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 03/08/2024
Mar 25 01:35:13.102479 kernel: Hypervisor detected: Microsoft Hyper-V
Mar 25 01:35:13.102491 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x64e24, misc 0xbed7b2
Mar 25 01:35:13.102503 kernel: Hyper-V: Host Build 10.0.20348.1799-1-0
Mar 25 01:35:13.102514 kernel: Hyper-V: Nested features: 0x1e0101
Mar 25 01:35:13.102526 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40
Mar 25 01:35:13.102540 kernel: Hyper-V: Using hypercall for remote TLB flush
Mar 25 01:35:13.102552 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
Mar 25 01:35:13.102564 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
Mar 25 01:35:13.102576 kernel: tsc: Marking TSC unstable due to running on Hyper-V
Mar 25 01:35:13.102588 kernel: tsc: Detected 2593.907 MHz processor
Mar 25 01:35:13.102600 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 25 01:35:13.102613 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 25 01:35:13.102624 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x400000000
Mar 25 01:35:13.102637 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Mar 25 01:35:13.102651 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 25 01:35:13.102663 kernel: e820: update [mem 0x40000000-0xffffffff] usable ==> reserved
Mar 25 01:35:13.102675 kernel: last_pfn = 0x40000 max_arch_pfn = 0x400000000
Mar 25 01:35:13.102687 kernel: Using GB pages for direct mapping
Mar 25 01:35:13.102699 kernel: ACPI: Early table checksum verification disabled
Mar 25 01:35:13.102711 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL)
Mar 25 01:35:13.102729 kernel: ACPI: XSDT 0x000000003FFF90E8 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 25 01:35:13.102744 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 25 01:35:13.102757 kernel: ACPI: DSDT 0x000000003FFD6000 01E184 (v02 MSFTVM DSDT01 00000001 MSFT 05000000)
Mar 25 01:35:13.102769 kernel: ACPI: FACS 0x000000003FFFE000 000040
Mar 25 01:35:13.102782 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 25 01:35:13.102795 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 25 01:35:13.102808 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 25 01:35:13.102821 kernel: ACPI: APIC 0x000000003FFD5000 000058 (v04 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 25 01:35:13.102836 kernel: ACPI: SRAT 0x000000003FFD4000 0002D0 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 25 01:35:13.102849 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 25 01:35:13.102862 kernel: ACPI: FPDT 0x000000003FFD2000 000034 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
Mar 25 01:35:13.102875 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113]
Mar 25 01:35:13.102887 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff4183]
Mar 25 01:35:13.102900 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f]
Mar 25 01:35:13.102913 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063]
Mar 25 01:35:13.102926 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f]
Mar 25 01:35:13.102941 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027]
Mar 25 01:35:13.102954 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5057]
Mar 25 01:35:13.102967 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd42cf]
Mar 25 01:35:13.102980 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037]
Mar 25 01:35:13.102992 kernel: ACPI: Reserving FPDT table memory at [mem 0x3ffd2000-0x3ffd2033]
Mar 25 01:35:13.103005 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Mar 25 01:35:13.103017 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Mar 25 01:35:13.103029 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug
Mar 25 01:35:13.103041 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff] hotplug
Mar 25 01:35:13.103057 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x2c0000000-0xfdfffffff] hotplug
Mar 25 01:35:13.103069 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug
Mar 25 01:35:13.103082 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug
Mar 25 01:35:13.103107 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug
Mar 25 01:35:13.103121 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug
Mar 25 01:35:13.103132 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug
Mar 25 01:35:13.103144 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug
Mar 25 01:35:13.103156 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug
Mar 25 01:35:13.103169 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug
Mar 25 01:35:13.103185 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug
Mar 25 01:35:13.103243 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000000-0x1ffffffffffff] hotplug
Mar 25 01:35:13.103258 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x2000000000000-0x3ffffffffffff] hotplug
Mar 25 01:35:13.103283 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x4000000000000-0x7ffffffffffff] hotplug
Mar 25 01:35:13.103296 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x8000000000000-0xfffffffffffff] hotplug
Mar 25 01:35:13.103309 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00000000-0x2bfffffff]
Mar 25 01:35:13.103322 kernel: NODE_DATA(0) allocated [mem 0x2bfffa000-0x2bfffffff]
Mar 25 01:35:13.103335 kernel: Zone ranges:
Mar 25 01:35:13.103349 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 25 01:35:13.103366 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 25 01:35:13.103379 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff]
Mar 25 01:35:13.103391 kernel: Movable zone start for each node
Mar 25 01:35:13.103404 kernel: Early memory node ranges
Mar 25 01:35:13.103417 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Mar 25 01:35:13.103431 kernel: node 0: [mem 0x0000000000100000-0x000000003ff40fff]
Mar 25 01:35:13.103443 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff]
Mar 25 01:35:13.103456 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff]
Mar 25 01:35:13.103468 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff]
Mar 25 01:35:13.103484 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 25 01:35:13.103497 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Mar 25 01:35:13.103509 kernel: On node 0, zone DMA32: 190 pages in unavailable ranges
Mar 25 01:35:13.103522 kernel: ACPI: PM-Timer IO Port: 0x408
Mar 25 01:35:13.103535 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] dfl dfl lint[0x1])
Mar 25 01:35:13.103547 kernel: IOAPIC[0]: apic_id 2, version 17, address 0xfec00000, GSI 0-23
Mar 25 01:35:13.103560 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 25 01:35:13.103573 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 25 01:35:13.103585 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200
Mar 25 01:35:13.103600 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Mar 25 01:35:13.103613 kernel: [mem 0x40000000-0xffffffff] available for PCI devices
Mar 25 01:35:13.103625 kernel: Booting paravirtualized kernel on Hyper-V
Mar 25 01:35:13.103638 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 25 01:35:13.103650 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 25 01:35:13.103663 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576
Mar 25 01:35:13.103692 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152
Mar 25 01:35:13.103704 kernel: pcpu-alloc: [0] 0 1
Mar 25 01:35:13.103717 kernel: Hyper-V: PV spinlocks enabled
Mar 25 01:35:13.103733 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 25 01:35:13.103748 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 01:35:13.103761 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 25 01:35:13.103774 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Mar 25 01:35:13.103787 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 25 01:35:13.103800 kernel: Fallback order for Node 0: 0
Mar 25 01:35:13.103813 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2062618
Mar 25 01:35:13.103826 kernel: Policy zone: Normal
Mar 25 01:35:13.103851 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 25 01:35:13.103865 kernel: software IO TLB: area num 2.
Mar 25 01:35:13.103881 kernel: Memory: 8072992K/8387460K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43592K init, 1472K bss, 314212K reserved, 0K cma-reserved)
Mar 25 01:35:13.103895 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 25 01:35:13.103908 kernel: ftrace: allocating 37985 entries in 149 pages
Mar 25 01:35:13.103922 kernel: ftrace: allocated 149 pages with 4 groups
Mar 25 01:35:13.103935 kernel: Dynamic Preempt: voluntary
Mar 25 01:35:13.103949 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 25 01:35:13.103963 kernel: rcu: RCU event tracing is enabled.
Mar 25 01:35:13.103977 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 25 01:35:13.103993 kernel: Trampoline variant of Tasks RCU enabled.
Mar 25 01:35:13.104007 kernel: Rude variant of Tasks RCU enabled.
Mar 25 01:35:13.104020 kernel: Tracing variant of Tasks RCU enabled.
Mar 25 01:35:13.104034 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 25 01:35:13.104048 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 25 01:35:13.104061 kernel: Using NULL legacy PIC
Mar 25 01:35:13.104077 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0
Mar 25 01:35:13.104091 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 25 01:35:13.104104 kernel: Console: colour dummy device 80x25
Mar 25 01:35:13.104118 kernel: printk: console [tty1] enabled
Mar 25 01:35:13.104131 kernel: printk: console [ttyS0] enabled
Mar 25 01:35:13.104144 kernel: printk: bootconsole [earlyser0] disabled
Mar 25 01:35:13.104158 kernel: ACPI: Core revision 20230628
Mar 25 01:35:13.104171 kernel: Failed to register legacy timer interrupt
Mar 25 01:35:13.104184 kernel: APIC: Switch to symmetric I/O mode setup
Mar 25 01:35:13.104218 kernel: Hyper-V: enabling crash_kexec_post_notifiers
Mar 25 01:35:13.104235 kernel: Hyper-V: Using IPI hypercalls
Mar 25 01:35:13.104247 kernel: APIC: send_IPI() replaced with hv_send_ipi()
Mar 25 01:35:13.104260 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask()
Mar 25 01:35:13.104273 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself()
Mar 25 01:35:13.104283 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself()
Mar 25 01:35:13.104291 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all()
Mar 25 01:35:13.107213 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self()
Mar 25 01:35:13.107235 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 5187.81 BogoMIPS (lpj=2593907)
Mar 25 01:35:13.107255 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Mar 25 01:35:13.107267 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Mar 25 01:35:13.107279 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 25 01:35:13.107291 kernel: Spectre V2 : Mitigation: Retpolines
Mar 25 01:35:13.107304 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 25 01:35:13.107316 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Mar 25 01:35:13.107329 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
Mar 25 01:35:13.107344 kernel: RETBleed: Vulnerable
Mar 25 01:35:13.107357 kernel: Speculative Store Bypass: Vulnerable
Mar 25 01:35:13.107372 kernel: TAA: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 25 01:35:13.107386 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 25 01:35:13.107403 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 25 01:35:13.107416 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 25 01:35:13.107430 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 25 01:35:13.107444 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
Mar 25 01:35:13.107456 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
Mar 25 01:35:13.107469 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
Mar 25 01:35:13.107517 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 25 01:35:13.107528 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
Mar 25 01:35:13.107537 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
Mar 25 01:35:13.107549 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
Mar 25 01:35:13.107557 kernel: x86/fpu: Enabled xstate features 0xe7, context size is 2432 bytes, using 'compacted' format.
Mar 25 01:35:13.107573 kernel: Freeing SMP alternatives memory: 32K
Mar 25 01:35:13.107581 kernel: pid_max: default: 32768 minimum: 301
Mar 25 01:35:13.107589 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 25 01:35:13.107600 kernel: landlock: Up and running.
Mar 25 01:35:13.107608 kernel: SELinux: Initializing.
Mar 25 01:35:13.107618 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 25 01:35:13.107627 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 25 01:35:13.107635 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8272CL CPU @ 2.60GHz (family: 0x6, model: 0x55, stepping: 0x7)
Mar 25 01:35:13.107647 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 25 01:35:13.107656 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 25 01:35:13.107668 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 25 01:35:13.107677 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only.
Mar 25 01:35:13.107685 kernel: signal: max sigframe size: 3632
Mar 25 01:35:13.107697 kernel: rcu: Hierarchical SRCU implementation.
Mar 25 01:35:13.107706 kernel: rcu: Max phase no-delay instances is 400.
Mar 25 01:35:13.107715 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 25 01:35:13.107725 kernel: smp: Bringing up secondary CPUs ...
Mar 25 01:35:13.107733 kernel: smpboot: x86: Booting SMP configuration:
Mar 25 01:35:13.107744 kernel: .... node #0, CPUs: #1
Mar 25 01:35:13.107753 kernel: TAA CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/tsx_async_abort.html for more details.
Mar 25 01:35:13.107765 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Mar 25 01:35:13.107773 kernel: smp: Brought up 1 node, 2 CPUs
Mar 25 01:35:13.107784 kernel: smpboot: Max logical packages: 1
Mar 25 01:35:13.107793 kernel: smpboot: Total of 2 processors activated (10375.62 BogoMIPS)
Mar 25 01:35:13.107801 kernel: devtmpfs: initialized
Mar 25 01:35:13.107812 kernel: x86/mm: Memory block size: 128MB
Mar 25 01:35:13.107821 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes)
Mar 25 01:35:13.107829 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 25 01:35:13.107842 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 25 01:35:13.107850 kernel: pinctrl core: initialized pinctrl subsystem
Mar 25 01:35:13.107858 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 25 01:35:13.107866 kernel: audit: initializing netlink subsys (disabled)
Mar 25 01:35:13.107875 kernel: audit: type=2000 audit(1742866511.028:1): state=initialized audit_enabled=0 res=1
Mar 25 01:35:13.107882 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 25 01:35:13.107890 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 25 01:35:13.107898 kernel: cpuidle: using governor menu
Mar 25 01:35:13.107906 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 25 01:35:13.107917 kernel: dca service started, version 1.12.1
Mar 25 01:35:13.107925 kernel: e820: reserve RAM buffer [mem 0x3ff41000-0x3fffffff]
Mar 25 01:35:13.107933 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 25 01:35:13.107944 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 25 01:35:13.107952 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 25 01:35:13.107960 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 25 01:35:13.107971 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 25 01:35:13.107979 kernel: ACPI: Added _OSI(Module Device)
Mar 25 01:35:13.107987 kernel: ACPI: Added _OSI(Processor Device)
Mar 25 01:35:13.108000 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 25 01:35:13.108008 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 25 01:35:13.108017 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 25 01:35:13.108027 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 25 01:35:13.108035 kernel: ACPI: Interpreter enabled
Mar 25 01:35:13.108043 kernel: ACPI: PM: (supports S0 S5)
Mar 25 01:35:13.108054 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 25 01:35:13.108062 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 25 01:35:13.108074 kernel: PCI: Ignoring E820 reservations for host bridge windows
Mar 25 01:35:13.108084 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F
Mar 25 01:35:13.108096 kernel: iommu: Default domain type: Translated
Mar 25 01:35:13.108104 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 25 01:35:13.108112 kernel: efivars: Registered efivars operations
Mar 25 01:35:13.108123 kernel: PCI: Using ACPI for IRQ routing
Mar 25 01:35:13.108131 kernel: PCI: System does not support PCI
Mar 25 01:35:13.108141 kernel: vgaarb: loaded
Mar 25 01:35:13.108150 kernel: clocksource: Switched to clocksource hyperv_clocksource_tsc_page
Mar 25 01:35:13.108158 kernel: VFS: Disk quotas dquot_6.6.0
Mar 25 01:35:13.108172 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 25 01:35:13.108180 kernel: pnp: PnP ACPI init
Mar 25 01:35:13.108188 kernel: pnp: PnP ACPI: found 3 devices
Mar 25 01:35:13.108211 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 25 01:35:13.108222 kernel: NET: Registered PF_INET protocol family
Mar 25 01:35:13.108231 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 25 01:35:13.108240 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Mar 25 01:35:13.108250 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 25 01:35:13.108258 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 25 01:35:13.108272 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Mar 25 01:35:13.108281 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Mar 25 01:35:13.108289 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Mar 25 01:35:13.108300 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Mar 25 01:35:13.108308 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 25 01:35:13.108318 kernel: NET: Registered PF_XDP protocol family
Mar 25 01:35:13.108328 kernel: PCI: CLS 0 bytes, default 64
Mar 25 01:35:13.108336 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Mar 25 01:35:13.108348 kernel: software IO TLB: mapped [mem 0x000000003ae83000-0x000000003ee83000] (64MB)
Mar 25 01:35:13.108359 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Mar 25 01:35:13.108368 kernel: Initialise system trusted keyrings
Mar 25 01:35:13.108378 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0
Mar 25 01:35:13.108386 kernel: Key type asymmetric registered
Mar 25 01:35:13.108396 kernel: Asymmetric key parser 'x509' registered
Mar 25 01:35:13.108405 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 25 01:35:13.108413 kernel: io scheduler mq-deadline registered
Mar 25 01:35:13.108424 kernel: io scheduler kyber registered
Mar 25 01:35:13.108432 kernel: io scheduler bfq registered
Mar 25 01:35:13.108444 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 25 01:35:13.108454 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 25 01:35:13.108462 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 25 01:35:13.108473 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
Mar 25 01:35:13.108483 kernel: i8042: PNP: No PS/2 controller found.
Mar 25 01:35:13.108630 kernel: rtc_cmos 00:02: registered as rtc0
Mar 25 01:35:13.108727 kernel: rtc_cmos 00:02: setting system clock to 2025-03-25T01:35:12 UTC (1742866512)
Mar 25 01:35:13.108821 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram
Mar 25 01:35:13.108832 kernel: intel_pstate: CPU model not supported
Mar 25 01:35:13.108843 kernel: efifb: probing for efifb
Mar 25 01:35:13.108852 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
Mar 25 01:35:13.108861 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
Mar 25 01:35:13.108872 kernel: efifb: scrolling: redraw
Mar 25 01:35:13.108881 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Mar 25 01:35:13.108890 kernel: Console: switching to colour frame buffer device 128x48
Mar 25 01:35:13.108900 kernel: fb0: EFI VGA frame buffer device
Mar 25 01:35:13.108915 kernel: pstore: Using crash dump compression: deflate
Mar 25 01:35:13.108923 kernel: pstore: Registered efi_pstore as persistent store backend
Mar 25 01:35:13.108932 kernel: NET: Registered PF_INET6 protocol family
Mar 25 01:35:13.108943 kernel: Segment Routing with IPv6
Mar 25 01:35:13.108951 kernel: In-situ OAM (IOAM) with IPv6
Mar 25 01:35:13.108961 kernel: NET: Registered PF_PACKET protocol family
Mar 25 01:35:13.108970 kernel: Key type dns_resolver registered
Mar 25 01:35:13.108978 kernel: IPI shorthand broadcast: enabled
Mar 25 01:35:13.108989 kernel: sched_clock: Marking stable (832002900, 49150600)->(1097129800, -215976300)
Mar 25 01:35:13.109000 kernel: registered taskstats version 1
Mar 25 01:35:13.109009 kernel: Loading compiled-in X.509 certificates
Mar 25 01:35:13.109020 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: eff01054e94a599f8e404b9a9482f4e2220f5386'
Mar 25 01:35:13.109028 kernel: Key type .fscrypt registered
Mar 25 01:35:13.109038 kernel: Key type fscrypt-provisioning registered
Mar 25 01:35:13.109048 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 25 01:35:13.109056 kernel: ima: Allocated hash algorithm: sha1
Mar 25 01:35:13.109068 kernel: ima: No architecture policies found
Mar 25 01:35:13.109076 kernel: clk: Disabling unused clocks
Mar 25 01:35:13.109088 kernel: Freeing unused kernel image (initmem) memory: 43592K
Mar 25 01:35:13.109097 kernel: Write protecting the kernel read-only data: 40960k
Mar 25 01:35:13.109106 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K
Mar 25 01:35:13.109117 kernel: Run /init as init process
Mar 25 01:35:13.109125 kernel: with arguments:
Mar 25 01:35:13.109133 kernel: /init
Mar 25 01:35:13.109144 kernel: with environment:
Mar 25 01:35:13.109152 kernel: HOME=/
Mar 25 01:35:13.109160 kernel: TERM=linux
Mar 25 01:35:13.109171 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 25 01:35:13.109182 systemd[1]: Successfully made /usr/ read-only.
Mar 25 01:35:13.109212 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 25 01:35:13.109224 systemd[1]: Detected virtualization microsoft.
Mar 25 01:35:13.109232 systemd[1]: Detected architecture x86-64.
Mar 25 01:35:13.109243 systemd[1]: Running in initrd.
Mar 25 01:35:13.109251 systemd[1]: No hostname configured, using default hostname.
Mar 25 01:35:13.109265 systemd[1]: Hostname set to <localhost>.
Mar 25 01:35:13.109275 systemd[1]: Initializing machine ID from random generator.
Mar 25 01:35:13.109283 systemd[1]: Queued start job for default target initrd.target.
Mar 25 01:35:13.109295 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 25 01:35:13.109303 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 25 01:35:13.109313 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 25 01:35:13.109324 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 25 01:35:13.109335 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 25 01:35:13.109347 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 25 01:35:13.109359 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 25 01:35:13.109370 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 25 01:35:13.109378 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 25 01:35:13.109390 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 25 01:35:13.109400 systemd[1]: Reached target paths.target - Path Units.
Mar 25 01:35:13.109411 systemd[1]: Reached target slices.target - Slice Units.
Mar 25 01:35:13.109421 systemd[1]: Reached target swap.target - Swaps.
Mar 25 01:35:13.109432 systemd[1]: Reached target timers.target - Timer Units.
Mar 25 01:35:13.109440 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 25 01:35:13.109449 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 25 01:35:13.109457 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 25 01:35:13.109466 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 25 01:35:13.109474 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 25 01:35:13.109483 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 25 01:35:13.109491 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 25 01:35:13.109502 systemd[1]: Reached target sockets.target - Socket Units.
Mar 25 01:35:13.109510 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 25 01:35:13.109519 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 25 01:35:13.109531 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 25 01:35:13.109539 systemd[1]: Starting systemd-fsck-usr.service...
Mar 25 01:35:13.109549 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 25 01:35:13.109559 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 25 01:35:13.109568 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:35:13.109597 systemd-journald[177]: Collecting audit messages is disabled.
Mar 25 01:35:13.109622 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 25 01:35:13.109634 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 25 01:35:13.109647 systemd-journald[177]: Journal started
Mar 25 01:35:13.109673 systemd-journald[177]: Runtime Journal (/run/log/journal/b3b0901a958241be865922108f2d4af2) is 8M, max 158.7M, 150.7M free.
Mar 25 01:35:13.101622 systemd-modules-load[178]: Inserted module 'overlay'
Mar 25 01:35:13.123835 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 25 01:35:13.124595 systemd[1]: Finished systemd-fsck-usr.service.
Mar 25 01:35:13.129405 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:35:13.138651 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:35:13.151728 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 25 01:35:13.154508 kernel: Bridge firewalling registered
Mar 25 01:35:13.153916 systemd-modules-load[178]: Inserted module 'br_netfilter'
Mar 25 01:35:13.159340 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 25 01:35:13.166835 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 25 01:35:13.179667 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 25 01:35:13.180006 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 25 01:35:13.185554 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 25 01:35:13.187253 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 25 01:35:13.195686 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:35:13.223094 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 25 01:35:13.226759 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 25 01:35:13.231686 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 25 01:35:13.244242 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 25 01:35:13.251298 dracut-cmdline[208]: dracut-dracut-053
Mar 25 01:35:13.251298 dracut-cmdline[208]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 01:35:13.255304 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 25 01:35:13.316084 systemd-resolved[226]: Positive Trust Anchors:
Mar 25 01:35:13.316098 systemd-resolved[226]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 25 01:35:13.316160 systemd-resolved[226]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 25 01:35:13.319972 systemd-resolved[226]: Defaulting to hostname 'linux'.
Mar 25 01:35:13.320954 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 25 01:35:13.325803 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 25 01:35:13.367216 kernel: SCSI subsystem initialized
Mar 25 01:35:13.377216 kernel: Loading iSCSI transport class v2.0-870.
Mar 25 01:35:13.388221 kernel: iscsi: registered transport (tcp)
Mar 25 01:35:13.410614 kernel: iscsi: registered transport (qla4xxx)
Mar 25 01:35:13.410672 kernel: QLogic iSCSI HBA Driver
Mar 25 01:35:13.446710 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 25 01:35:13.450327 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 25 01:35:13.486234 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 25 01:35:13.486324 kernel: device-mapper: uevent: version 1.0.3
Mar 25 01:35:13.489927 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 25 01:35:13.530219 kernel: raid6: avx512x4 gen() 18259 MB/s
Mar 25 01:35:13.549210 kernel: raid6: avx512x2 gen() 18071 MB/s
Mar 25 01:35:13.568212 kernel: raid6: avx512x1 gen() 18128 MB/s
Mar 25 01:35:13.587207 kernel: raid6: avx2x4 gen() 18178 MB/s
Mar 25 01:35:13.606210 kernel: raid6: avx2x2 gen() 18166 MB/s
Mar 25 01:35:13.626207 kernel: raid6: avx2x1 gen() 13908 MB/s
Mar 25 01:35:13.626237 kernel: raid6: using algorithm avx512x4 gen() 18259 MB/s
Mar 25 01:35:13.647202 kernel: raid6: .... xor() 7904 MB/s, rmw enabled
Mar 25 01:35:13.647242 kernel: raid6: using avx512x2 recovery algorithm
Mar 25 01:35:13.670223 kernel: xor: automatically using best checksumming function avx
Mar 25 01:35:13.812223 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 25 01:35:13.821473 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 25 01:35:13.825333 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 25 01:35:13.845685 systemd-udevd[397]: Using default interface naming scheme 'v255'.
Mar 25 01:35:13.850882 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 25 01:35:13.864340 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 25 01:35:13.886879 dracut-pre-trigger[413]: rd.md=0: removing MD RAID activation
Mar 25 01:35:13.916283 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 25 01:35:13.920372 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 25 01:35:13.969720 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 25 01:35:13.977521 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 25 01:35:14.011897 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 25 01:35:14.018866 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 25 01:35:14.022593 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 25 01:35:14.026019 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 25 01:35:14.042562 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 25 01:35:14.074188 kernel: cryptd: max_cpu_qlen set to 1000
Mar 25 01:35:14.076426 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 25 01:35:14.095298 kernel: hv_vmbus: Vmbus version:5.2
Mar 25 01:35:14.095798 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 25 01:35:14.098746 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:35:14.116109 kernel: AVX2 version of gcm_enc/dec engaged.
Mar 25 01:35:14.116139 kernel: AES CTR mode by8 optimization enabled
Mar 25 01:35:14.102316 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:35:14.105675 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 25 01:35:14.105833 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:35:14.126138 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:35:14.135035 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:35:14.139668 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 25 01:35:14.150504 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 25 01:35:14.151174 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:35:14.167463 kernel: pps_core: LinuxPPS API ver. 1 registered
Mar 25 01:35:14.167518 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Mar 25 01:35:14.167983 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:35:14.179219 kernel: PTP clock support registered
Mar 25 01:35:14.183241 kernel: hv_vmbus: registering driver hyperv_keyboard
Mar 25 01:35:14.186239 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 25 01:35:14.193390 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/VMBUS:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0
Mar 25 01:35:14.193423 kernel: hv_utils: Registering HyperV Utility Driver
Mar 25 01:35:14.196996 kernel: hv_vmbus: registering driver hv_utils
Mar 25 01:35:14.202992 kernel: hv_utils: Heartbeat IC version 3.0
Mar 25 01:35:14.203038 kernel: hv_utils: Shutdown IC version 3.2
Mar 25 01:35:14.205135 kernel: hv_utils: TimeSync IC version 4.0
Mar 25 01:35:13.971541 systemd-resolved[226]: Clock change detected. Flushing caches.
Mar 25 01:35:13.980059 systemd-journald[177]: Time jumped backwards, rotating.
Mar 25 01:35:13.986398 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:35:13.994617 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:35:14.004888 kernel: hv_vmbus: registering driver hv_netvsc
Mar 25 01:35:14.008501 kernel: hv_vmbus: registering driver hv_storvsc
Mar 25 01:35:14.008528 kernel: hv_vmbus: registering driver hid_hyperv
Mar 25 01:35:14.016857 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1
Mar 25 01:35:14.016912 kernel: scsi host1: storvsc_host_t
Mar 25 01:35:14.024044 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on
Mar 25 01:35:14.028541 kernel: scsi host0: storvsc_host_t
Mar 25 01:35:14.028725 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5
Mar 25 01:35:14.034520 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 0
Mar 25 01:35:14.058378 kernel: sr 0:0:0:2: [sr0] scsi-1 drive
Mar 25 01:35:14.064457 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 25 01:35:14.064499 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0
Mar 25 01:35:14.058562 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:35:14.082675 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB)
Mar 25 01:35:14.098332 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks
Mar 25 01:35:14.098530 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 25 01:35:14.098693 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00
Mar 25 01:35:14.098847 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA
Mar 25 01:35:14.099022 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 01:35:14.099040 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 25 01:35:14.171508 kernel: hv_netvsc 7c1e5234-9c19-7c1e-5234-9c197c1e5234 eth0: VF slot 1 added
Mar 25 01:35:14.179493 kernel: hv_vmbus: registering driver hv_pci
Mar 25 01:35:14.179523 kernel: hv_pci 505ee884-4ad1-4a89-84d0-4b2d756a680c: PCI VMBus probing: Using version 0x10004
Mar 25 01:35:14.227418 kernel: hv_pci 505ee884-4ad1-4a89-84d0-4b2d756a680c: PCI host bridge to bus 4ad1:00
Mar 25 01:35:14.228041 kernel: pci_bus 4ad1:00: root bus resource [mem 0xfe0000000-0xfe00fffff window]
Mar 25 01:35:14.228225 kernel: pci_bus 4ad1:00: No busn resource found for root bus, will use [bus 00-ff]
Mar 25 01:35:14.228380 kernel: pci 4ad1:00:02.0: [15b3:1016] type 00 class 0x020000
Mar 25 01:35:14.228587 kernel: pci 4ad1:00:02.0: reg 0x10: [mem 0xfe0000000-0xfe00fffff 64bit pref]
Mar 25 01:35:14.228762 kernel: pci 4ad1:00:02.0: enabling Extended Tags
Mar 25 01:35:14.228930 kernel: pci 4ad1:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 4ad1:00:02.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link)
Mar 25 01:35:14.229103 kernel: pci_bus 4ad1:00: busn_res: [bus 00-ff] end is updated to 00
Mar 25 01:35:14.229257 kernel: pci 4ad1:00:02.0: BAR 0: assigned [mem 0xfe0000000-0xfe00fffff 64bit pref]
Mar 25 01:35:14.390158 kernel: mlx5_core 4ad1:00:02.0: enabling device (0000 -> 0002)
Mar 25 01:35:14.617662 kernel: mlx5_core 4ad1:00:02.0: firmware version: 14.30.5000
Mar 25 01:35:14.617884 kernel: hv_netvsc 7c1e5234-9c19-7c1e-5234-9c197c1e5234 eth0: VF registering: eth1
Mar 25 01:35:14.618044 kernel: mlx5_core 4ad1:00:02.0 eth1: joined to eth0
Mar 25 01:35:14.618229 kernel: mlx5_core 4ad1:00:02.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic)
Mar 25 01:35:14.630384 kernel: mlx5_core 4ad1:00:02.0 enP19153s1: renamed from eth1
Mar 25 01:35:14.640563 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by (udev-worker) (460)
Mar 25 01:35:14.645327 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_Disk EFI-SYSTEM.
Mar 25 01:35:14.657284 kernel: BTRFS: device fsid 6d9424cd-1432-492b-b006-b311869817e2 devid 1 transid 39 /dev/sda3 scanned by (udev-worker) (452)
Mar 25 01:35:14.683722 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_Disk ROOT.
Mar 25 01:35:14.700462 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 25 01:35:14.711328 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_Disk USR-A.
Mar 25 01:35:14.714843 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_Disk USR-A.
Mar 25 01:35:14.733951 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 25 01:35:14.761498 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 01:35:15.774344 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 01:35:15.776531 disk-uuid[610]: The operation has completed successfully.
Mar 25 01:35:15.863532 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 25 01:35:15.863656 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 25 01:35:15.901659 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 25 01:35:15.917972 sh[696]: Success
Mar 25 01:35:15.947497 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2"
Mar 25 01:35:16.138669 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 25 01:35:16.147570 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 25 01:35:16.165370 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 25 01:35:16.182500 kernel: BTRFS info (device dm-0): first mount of filesystem 6d9424cd-1432-492b-b006-b311869817e2
Mar 25 01:35:16.182541 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:35:16.188103 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 25 01:35:16.191236 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 25 01:35:16.193962 kernel: BTRFS info (device dm-0): using free space tree
Mar 25 01:35:16.477724 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 25 01:35:16.483968 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 25 01:35:16.490585 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 25 01:35:16.500614 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 25 01:35:16.527007 kernel: BTRFS info (device sda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:35:16.527062 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:35:16.527080 kernel: BTRFS info (device sda6): using free space tree
Mar 25 01:35:16.557535 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 25 01:35:16.564657 kernel: BTRFS info (device sda6): last unmount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:35:16.567953 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 25 01:35:16.577630 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 25 01:35:16.591711 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 25 01:35:16.599606 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 25 01:35:16.635360 systemd-networkd[877]: lo: Link UP
Mar 25 01:35:16.635370 systemd-networkd[877]: lo: Gained carrier
Mar 25 01:35:16.637590 systemd-networkd[877]: Enumeration completed
Mar 25 01:35:16.637824 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 25 01:35:16.641133 systemd-networkd[877]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:35:16.641137 systemd-networkd[877]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:35:16.643082 systemd[1]: Reached target network.target - Network.
Mar 25 01:35:16.696507 kernel: mlx5_core 4ad1:00:02.0 enP19153s1: Link up
Mar 25 01:35:16.728116 kernel: hv_netvsc 7c1e5234-9c19-7c1e-5234-9c197c1e5234 eth0: Data path switched to VF: enP19153s1
Mar 25 01:35:16.727612 systemd-networkd[877]: enP19153s1: Link UP
Mar 25 01:35:16.727758 systemd-networkd[877]: eth0: Link UP
Mar 25 01:35:16.727968 systemd-networkd[877]: eth0: Gained carrier
Mar 25 01:35:16.727981 systemd-networkd[877]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:35:16.733701 systemd-networkd[877]: enP19153s1: Gained carrier
Mar 25 01:35:16.764535 systemd-networkd[877]: eth0: DHCPv4 address 10.200.8.14/24, gateway 10.200.8.1 acquired from 168.63.129.16
Mar 25 01:35:17.339505 ignition[862]: Ignition 2.20.0
Mar 25 01:35:17.339517 ignition[862]: Stage: fetch-offline
Mar 25 01:35:17.341061 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 25 01:35:17.339556 ignition[862]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:35:17.339568 ignition[862]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 25 01:35:17.339690 ignition[862]: parsed url from cmdline: ""
Mar 25 01:35:17.339695 ignition[862]: no config URL provided
Mar 25 01:35:17.339702 ignition[862]: reading system config file "/usr/lib/ignition/user.ign"
Mar 25 01:35:17.339713 ignition[862]: no config at "/usr/lib/ignition/user.ign"
Mar 25 01:35:17.339721 ignition[862]: failed to fetch config: resource requires networking
Mar 25 01:35:17.339926 ignition[862]: Ignition finished successfully
Mar 25 01:35:17.365622 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 25 01:35:17.384970 ignition[886]: Ignition 2.20.0
Mar 25 01:35:17.384982 ignition[886]: Stage: fetch
Mar 25 01:35:17.385200 ignition[886]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:35:17.385214 ignition[886]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 25 01:35:17.385328 ignition[886]: parsed url from cmdline: ""
Mar 25 01:35:17.385331 ignition[886]: no config URL provided
Mar 25 01:35:17.385336 ignition[886]: reading system config file "/usr/lib/ignition/user.ign"
Mar 25 01:35:17.385344 ignition[886]: no config at "/usr/lib/ignition/user.ign"
Mar 25 01:35:17.385367 ignition[886]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1
Mar 25 01:35:17.482060 ignition[886]: GET result: OK
Mar 25 01:35:17.482175 ignition[886]: config has been read from IMDS userdata
Mar 25 01:35:17.482212 ignition[886]: parsing config with SHA512: 4c3eabd0cd513e5ab38c9f0d3ae45a618b014b163ebd2e51839bdd7383c121ec84bf26f2d531352d9e1057101d617cf5e5ab5b3ebfd1e93715c62834b80b2230
Mar 25 01:35:17.490876 unknown[886]: fetched base config from "system"
Mar 25 01:35:17.490891 unknown[886]: fetched base config from "system"
Mar 25 01:35:17.491342 ignition[886]: fetch: fetch complete
Mar 25 01:35:17.490899 unknown[886]: fetched user config from "azure"
Mar 25 01:35:17.491348 ignition[886]: fetch: fetch passed
Mar 25 01:35:17.492962 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 25 01:35:17.491397 ignition[886]: Ignition finished successfully
Mar 25 01:35:17.509905 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 25 01:35:17.533592 ignition[892]: Ignition 2.20.0
Mar 25 01:35:17.533603 ignition[892]: Stage: kargs
Mar 25 01:35:17.533806 ignition[892]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:35:17.533819 ignition[892]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 25 01:35:17.534681 ignition[892]: kargs: kargs passed
Mar 25 01:35:17.534724 ignition[892]: Ignition finished successfully
Mar 25 01:35:17.542159 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 25 01:35:17.550607 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 25 01:35:17.573936 ignition[898]: Ignition 2.20.0
Mar 25 01:35:17.575922 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 25 01:35:17.573947 ignition[898]: Stage: disks
Mar 25 01:35:17.576614 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 25 01:35:17.574151 ignition[898]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:35:17.577751 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 25 01:35:17.574164 ignition[898]: no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 25 01:35:17.577997 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 25 01:35:17.575051 ignition[898]: disks: disks passed
Mar 25 01:35:17.578450 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 25 01:35:17.575095 ignition[898]: Ignition finished successfully
Mar 25 01:35:17.578903 systemd[1]: Reached target basic.target - Basic System.
Mar 25 01:35:17.582606 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 25 01:35:17.645136 systemd-fsck[907]: ROOT: clean, 14/7326000 files, 477710/7359488 blocks
Mar 25 01:35:17.651156 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 25 01:35:17.658567 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 25 01:35:17.756503 kernel: EXT4-fs (sda9): mounted filesystem 4e6dca82-2e50-453c-be25-61f944b72008 r/w with ordered data mode. Quota mode: none.
Mar 25 01:35:17.756763 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 25 01:35:17.759754 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 25 01:35:17.793579 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 25 01:35:17.801313 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 25 01:35:17.820500 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (918)
Mar 25 01:35:17.809271 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 25 01:35:17.812823 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 25 01:35:17.842755 kernel: BTRFS info (device sda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:35:17.842784 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:35:17.842800 kernel: BTRFS info (device sda6): using free space tree
Mar 25 01:35:17.842817 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 25 01:35:17.812860 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 25 01:35:17.822605 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 25 01:35:17.844266 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 25 01:35:17.852598 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 25 01:35:18.452363 coreos-metadata[920]: Mar 25 01:35:18.452 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 25 01:35:18.459240 coreos-metadata[920]: Mar 25 01:35:18.459 INFO Fetch successful
Mar 25 01:35:18.462521 coreos-metadata[920]: Mar 25 01:35:18.459 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1
Mar 25 01:35:18.473170 coreos-metadata[920]: Mar 25 01:35:18.473 INFO Fetch successful
Mar 25 01:35:18.475950 coreos-metadata[920]: Mar 25 01:35:18.475 INFO wrote hostname ci-4284.0.0-a-0ecc1f6a74 to /sysroot/etc/hostname
Mar 25 01:35:18.477754 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 25 01:35:18.498332 initrd-setup-root[949]: cut: /sysroot/etc/passwd: No such file or directory
Mar 25 01:35:18.534030 initrd-setup-root[956]: cut: /sysroot/etc/group: No such file or directory
Mar 25 01:35:18.555276 initrd-setup-root[963]: cut: /sysroot/etc/shadow: No such file or directory
Mar 25 01:35:18.560276 initrd-setup-root[970]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 25 01:35:18.600700 systemd-networkd[877]: eth0: Gained IPv6LL
Mar 25 01:35:18.728753 systemd-networkd[877]: enP19153s1: Gained IPv6LL
Mar 25 01:35:19.259694 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 25 01:35:19.267456 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 25 01:35:19.279616 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 25 01:35:19.287989 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 25 01:35:19.293374 kernel: BTRFS info (device sda6): last unmount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:35:19.320777 ignition[1038]: INFO : Ignition 2.20.0
Mar 25 01:35:19.323680 ignition[1038]: INFO : Stage: mount
Mar 25 01:35:19.325842 ignition[1038]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 01:35:19.325842 ignition[1038]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 25 01:35:19.325842 ignition[1038]: INFO : mount: mount passed
Mar 25 01:35:19.325842 ignition[1038]: INFO : Ignition finished successfully
Mar 25 01:35:19.327733 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 25 01:35:19.341996 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 25 01:35:19.349090 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 25 01:35:19.361850 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 25 01:35:19.380497 kernel: BTRFS: device label OEM devid 1 transid 18 /dev/sda6 scanned by mount (1050)
Mar 25 01:35:19.380532 kernel: BTRFS info (device sda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:35:19.384498 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:35:19.389071 kernel: BTRFS info (device sda6): using free space tree
Mar 25 01:35:19.394493 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 25 01:35:19.395976 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 25 01:35:19.422974 ignition[1067]: INFO : Ignition 2.20.0
Mar 25 01:35:19.422974 ignition[1067]: INFO : Stage: files
Mar 25 01:35:19.427274 ignition[1067]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 01:35:19.427274 ignition[1067]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
Mar 25 01:35:19.427274 ignition[1067]: DEBUG : files: compiled without relabeling support, skipping
Mar 25 01:35:19.442487 ignition[1067]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 25 01:35:19.442487 ignition[1067]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 25 01:35:19.533036 ignition[1067]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 25 01:35:19.537270 ignition[1067]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 25 01:35:19.537270 ignition[1067]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 25 01:35:19.533558 unknown[1067]: wrote ssh authorized keys file for user: core
Mar 25 01:35:19.549713 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Mar 25 01:35:19.555083 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Mar 25 01:35:19.619664 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 25 01:35:19.809201 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Mar 25 01:35:19.809201 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 01:35:19.819601 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-x86-64.raw: attempt #1
Mar 25 01:35:20.299179 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 25 01:35:20.605954 ignition[1067]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 01:35:20.605954 ignition[1067]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 25 01:35:20.619530 ignition[1067]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 01:35:20.625166 ignition[1067]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 01:35:20.625166 ignition[1067]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 25 01:35:20.625166 ignition[1067]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 25 01:35:20.638503 ignition[1067]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 25 01:35:20.638503 ignition[1067]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 01:35:20.638503 ignition[1067]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 01:35:20.638503 ignition[1067]: INFO : files: files passed
Mar 25 01:35:20.651996 ignition[1067]: INFO : Ignition finished successfully
Mar 25 01:35:20.648214 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 25 01:35:20.663134 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 25 01:35:20.667819 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 25 01:35:20.678085 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 25 01:35:20.678303 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 25 01:35:20.748968 initrd-setup-root-after-ignition[1096]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:35:20.748968 initrd-setup-root-after-ignition[1096]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:35:20.758096 initrd-setup-root-after-ignition[1100]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:35:20.763778 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 25 01:35:20.764505 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 25 01:35:20.767615 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 25 01:35:20.820673 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 25 01:35:20.820797 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 25 01:35:20.827886 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 25 01:35:20.837048 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 25 01:35:20.839940 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 25 01:35:20.842604 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 25 01:35:20.872718 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 25 01:35:20.879632 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 25 01:35:20.900135 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 25 01:35:20.900430 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 25 01:35:20.901065 systemd[1]: Stopped target timers.target - Timer Units. Mar 25 01:35:20.902110 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 25 01:35:20.902250 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:35:20.903033 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 25 01:35:20.903549 systemd[1]: Stopped target basic.target - Basic System. Mar 25 01:35:20.904059 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 25 01:35:20.904590 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:35:20.905065 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 25 01:35:20.905608 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 25 01:35:20.906055 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:35:20.906551 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 25 01:35:20.906974 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 25 01:35:20.907399 systemd[1]: Stopped target swap.target - Swaps. Mar 25 01:35:20.907778 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 25 01:35:20.907915 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:35:20.908807 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:35:20.909283 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:35:20.909699 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 25 01:35:20.952277 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:35:21.008831 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 25 01:35:21.009044 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. 
Mar 25 01:35:21.015281 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 25 01:35:21.015410 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:35:21.027623 systemd[1]: ignition-files.service: Deactivated successfully. Mar 25 01:35:21.027766 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 25 01:35:21.035615 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Mar 25 01:35:21.035790 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 25 01:35:21.046436 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 25 01:35:21.049101 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 25 01:35:21.051599 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:35:21.061639 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 25 01:35:21.064155 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 25 01:35:21.064319 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:35:21.069966 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 25 01:35:21.070079 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:35:21.093225 ignition[1120]: INFO : Ignition 2.20.0 Mar 25 01:35:21.093225 ignition[1120]: INFO : Stage: umount Mar 25 01:35:21.097569 ignition[1120]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:35:21.097569 ignition[1120]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 25 01:35:21.097983 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 25 01:35:21.106695 ignition[1120]: INFO : umount: umount passed Mar 25 01:35:21.106695 ignition[1120]: INFO : Ignition finished successfully Mar 25 01:35:21.098077 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. 
Mar 25 01:35:21.114030 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 25 01:35:21.114745 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 25 01:35:21.114851 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 25 01:35:21.121794 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 25 01:35:21.121916 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 25 01:35:21.124702 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 25 01:35:21.124750 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 25 01:35:21.125108 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 25 01:35:21.125143 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 25 01:35:21.125599 systemd[1]: Stopped target network.target - Network. Mar 25 01:35:21.126016 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 25 01:35:21.126054 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:35:21.126105 systemd[1]: Stopped target paths.target - Path Units. Mar 25 01:35:21.128527 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 25 01:35:21.147665 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:35:21.153635 systemd[1]: Stopped target slices.target - Slice Units. Mar 25 01:35:21.156161 systemd[1]: Stopped target sockets.target - Socket Units. Mar 25 01:35:21.161298 systemd[1]: iscsid.socket: Deactivated successfully. Mar 25 01:35:21.161355 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:35:21.199957 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 25 01:35:21.200018 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:35:21.205182 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 25 01:35:21.205256 systemd[1]: Stopped ignition-setup.service - Ignition (setup). 
Mar 25 01:35:21.216051 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 25 01:35:21.216121 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 25 01:35:21.221778 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 25 01:35:21.230238 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 25 01:35:21.238709 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 25 01:35:21.241379 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 25 01:35:21.248586 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 25 01:35:21.248845 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 25 01:35:21.248928 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 25 01:35:21.259126 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 25 01:35:21.259755 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 25 01:35:21.259824 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:35:21.265592 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 25 01:35:21.273151 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 25 01:35:21.276197 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:35:21.282252 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 25 01:35:21.282310 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:35:21.287922 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 25 01:35:21.287969 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 25 01:35:21.305772 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. 
Mar 25 01:35:21.305837 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:35:21.311979 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:35:21.319488 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 25 01:35:21.319554 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:35:21.333408 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 25 01:35:21.334705 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:35:21.339808 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 25 01:35:21.339902 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 25 01:35:21.346043 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 25 01:35:21.346088 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:35:21.359352 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 25 01:35:21.359418 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:35:21.367910 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 25 01:35:21.367973 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 25 01:35:21.373399 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:35:21.373449 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:35:21.387491 kernel: hv_netvsc 7c1e5234-9c19-7c1e-5234-9c197c1e5234 eth0: Data path switched from VF: enP19153s1 Mar 25 01:35:21.390819 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 25 01:35:21.394685 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. 
Mar 25 01:35:21.394751 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:35:21.401413 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:35:21.401462 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:35:21.417932 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 25 01:35:21.418017 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:35:21.420869 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 25 01:35:21.420988 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 25 01:35:21.426029 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 25 01:35:21.426120 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 25 01:35:21.643669 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 25 01:35:21.643818 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 25 01:35:21.652440 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 25 01:35:21.658315 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 25 01:35:21.658387 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 25 01:35:21.666603 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 25 01:35:22.034966 systemd[1]: Switching root. Mar 25 01:35:22.102592 systemd-journald[177]: Journal stopped Mar 25 01:35:25.497865 systemd-journald[177]: Received SIGTERM from PID 1 (systemd). 
Mar 25 01:35:25.497911 kernel: SELinux: policy capability network_peer_controls=1 Mar 25 01:35:25.497938 kernel: SELinux: policy capability open_perms=1 Mar 25 01:35:25.497957 kernel: SELinux: policy capability extended_socket_class=1 Mar 25 01:35:25.497974 kernel: SELinux: policy capability always_check_network=0 Mar 25 01:35:25.497990 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 25 01:35:25.498010 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 25 01:35:25.498027 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 25 01:35:25.498049 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 25 01:35:25.498064 kernel: audit: type=1403 audit(1742866522.832:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 25 01:35:25.498084 systemd[1]: Successfully loaded SELinux policy in 120.147ms. Mar 25 01:35:25.498104 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.115ms. Mar 25 01:35:25.498128 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:35:25.498150 systemd[1]: Detected virtualization microsoft. Mar 25 01:35:25.498174 systemd[1]: Detected architecture x86-64. Mar 25 01:35:25.498197 systemd[1]: Detected first boot. Mar 25 01:35:25.498219 systemd[1]: Hostname set to <ci-4284.0.0-a-0ecc1f6a74>. Mar 25 01:35:25.498242 systemd[1]: Initializing machine ID from random generator. Mar 25 01:35:25.498266 zram_generator::config[1165]: No configuration found. 
Mar 25 01:35:25.498292 kernel: Guest personality initialized and is inactive Mar 25 01:35:25.498312 kernel: VMCI host device registered (name=vmci, major=10, minor=124) Mar 25 01:35:25.498334 kernel: Initialized host personality Mar 25 01:35:25.498352 kernel: NET: Registered PF_VSOCK protocol family Mar 25 01:35:25.498373 systemd[1]: Populated /etc with preset unit settings. Mar 25 01:35:25.498393 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 25 01:35:25.498411 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 25 01:35:25.498431 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 25 01:35:25.498454 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 25 01:35:25.500515 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 25 01:35:25.500548 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 25 01:35:25.500564 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 25 01:35:25.500580 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 25 01:35:25.500596 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 25 01:35:25.500613 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 25 01:35:25.500628 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 25 01:35:25.500652 systemd[1]: Created slice user.slice - User and Session Slice. Mar 25 01:35:25.500668 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:35:25.500685 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:35:25.500701 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
Mar 25 01:35:25.500718 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 25 01:35:25.500740 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 25 01:35:25.500759 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:35:25.500776 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 25 01:35:25.500798 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:35:25.500815 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 25 01:35:25.500832 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 25 01:35:25.500851 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 25 01:35:25.500869 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 25 01:35:25.500890 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:35:25.500908 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:35:25.500930 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:35:25.500947 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:35:25.500966 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 25 01:35:25.500984 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 25 01:35:25.501002 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 25 01:35:25.501021 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:35:25.501044 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:35:25.501063 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Mar 25 01:35:25.501081 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 25 01:35:25.501099 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 25 01:35:25.501117 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 25 01:35:25.501136 systemd[1]: Mounting media.mount - External Media Directory... Mar 25 01:35:25.501154 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 25 01:35:25.501176 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 25 01:35:25.501195 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 25 01:35:25.501213 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 25 01:35:25.501232 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 25 01:35:25.501251 systemd[1]: Reached target machines.target - Containers. Mar 25 01:35:25.501269 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 25 01:35:25.501291 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:35:25.501310 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:35:25.501328 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 25 01:35:25.501351 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:35:25.501368 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:35:25.501387 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:35:25.501406 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... 
Mar 25 01:35:25.501424 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:35:25.501442 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 25 01:35:25.501460 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 25 01:35:25.501557 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 25 01:35:25.501578 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 25 01:35:25.501588 systemd[1]: Stopped systemd-fsck-usr.service. Mar 25 01:35:25.506989 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:35:25.507018 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:35:25.507035 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:35:25.507049 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 25 01:35:25.507063 kernel: fuse: init (API version 7.39) Mar 25 01:35:25.507076 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 25 01:35:25.507094 kernel: loop: module loaded Mar 25 01:35:25.507105 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 25 01:35:25.507143 systemd-journald[1272]: Collecting audit messages is disabled. Mar 25 01:35:25.507173 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:35:25.507189 systemd[1]: verity-setup.service: Deactivated successfully. Mar 25 01:35:25.507202 systemd[1]: Stopped verity-setup.service. 
Mar 25 01:35:25.507214 systemd-journald[1272]: Journal started Mar 25 01:35:25.507238 systemd-journald[1272]: Runtime Journal (/run/log/journal/ba1cfaf1d1634a41974e91dd47450de8) is 8M, max 158.7M, 150.7M free. Mar 25 01:35:24.923226 systemd[1]: Queued start job for default target multi-user.target. Mar 25 01:35:24.932399 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Mar 25 01:35:24.932807 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 25 01:35:25.519509 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 25 01:35:25.528500 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:35:25.531680 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 25 01:35:25.534578 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 25 01:35:25.537915 systemd[1]: Mounted media.mount - External Media Directory. Mar 25 01:35:25.541031 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 25 01:35:25.544151 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 25 01:35:25.547468 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 25 01:35:25.550390 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 25 01:35:25.554044 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:35:25.557897 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 25 01:35:25.558117 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 25 01:35:25.561758 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:35:25.561958 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:35:25.565341 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Mar 25 01:35:25.565557 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:35:25.569452 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 25 01:35:25.569661 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 25 01:35:25.573013 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:35:25.573324 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:35:25.577138 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:35:25.580733 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 25 01:35:25.584942 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 25 01:35:25.602368 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 25 01:35:25.607976 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 25 01:35:25.615572 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 25 01:35:25.632569 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 25 01:35:25.635740 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 25 01:35:25.635780 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:35:25.645068 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 25 01:35:25.652226 kernel: ACPI: bus type drm_connector registered Mar 25 01:35:25.653917 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 25 01:35:25.659710 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... 
Mar 25 01:35:25.662815 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:35:25.667707 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 25 01:35:25.668668 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 25 01:35:25.668852 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:35:25.671639 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 25 01:35:25.671717 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:35:25.676982 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:35:25.687763 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 25 01:35:25.693465 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 25 01:35:25.698964 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:35:25.699241 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:35:25.703673 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 25 01:35:25.707124 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 25 01:35:25.714549 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 25 01:35:25.740679 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:35:25.746061 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 25 01:35:25.756862 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. 
Mar 25 01:35:25.759631 systemd-journald[1272]: Time spent on flushing to /var/log/journal/ba1cfaf1d1634a41974e91dd47450de8 is 62.149ms for 975 entries. Mar 25 01:35:25.759631 systemd-journald[1272]: System Journal (/var/log/journal/ba1cfaf1d1634a41974e91dd47450de8) is 11.8M, max 2.6G, 2.6G free. Mar 25 01:35:25.943414 systemd-journald[1272]: Received client request to flush runtime journal. Mar 25 01:35:25.943489 kernel: loop0: detected capacity change from 0 to 218376 Mar 25 01:35:25.943520 systemd-journald[1272]: /var/log/journal/ba1cfaf1d1634a41974e91dd47450de8/system.journal: Realtime clock jumped backwards relative to last journal entry, rotating. Mar 25 01:35:25.943546 systemd-journald[1272]: Rotating system journal. Mar 25 01:35:25.943570 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 25 01:35:25.943587 kernel: loop1: detected capacity change from 0 to 109808 Mar 25 01:35:25.763997 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 25 01:35:25.772591 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 25 01:35:25.792075 udevadm[1315]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 25 01:35:25.832661 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:35:25.945586 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 25 01:35:25.954729 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 25 01:35:25.955746 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 25 01:35:26.004251 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 25 01:35:26.012637 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:35:26.112865 systemd-tmpfiles[1327]: ACLs are not supported, ignoring. 
Mar 25 01:35:26.112891 systemd-tmpfiles[1327]: ACLs are not supported, ignoring. Mar 25 01:35:26.117991 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:35:26.255097 kernel: loop2: detected capacity change from 0 to 151640 Mar 25 01:35:26.613510 kernel: loop3: detected capacity change from 0 to 28424 Mar 25 01:35:26.915537 kernel: loop4: detected capacity change from 0 to 218376 Mar 25 01:35:26.925533 kernel: loop5: detected capacity change from 0 to 109808 Mar 25 01:35:26.936510 kernel: loop6: detected capacity change from 0 to 151640 Mar 25 01:35:26.949527 kernel: loop7: detected capacity change from 0 to 28424 Mar 25 01:35:26.954568 (sd-merge)[1333]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'. Mar 25 01:35:26.955132 (sd-merge)[1333]: Merged extensions into '/usr'. Mar 25 01:35:26.958954 systemd[1]: Reload requested from client PID 1306 ('systemd-sysext') (unit systemd-sysext.service)... Mar 25 01:35:26.958974 systemd[1]: Reloading... Mar 25 01:35:27.053982 zram_generator::config[1361]: No configuration found. Mar 25 01:35:27.218998 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:35:27.299312 systemd[1]: Reloading finished in 339 ms. Mar 25 01:35:27.316746 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 25 01:35:27.320552 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 25 01:35:27.334609 systemd[1]: Starting ensure-sysext.service... Mar 25 01:35:27.339631 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:35:27.345654 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Mar 25 01:35:27.374225 systemd-tmpfiles[1421]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 25 01:35:27.374634 systemd-tmpfiles[1421]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 25 01:35:27.376345 systemd-tmpfiles[1421]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 25 01:35:27.376871 systemd-tmpfiles[1421]: ACLs are not supported, ignoring. Mar 25 01:35:27.376948 systemd-tmpfiles[1421]: ACLs are not supported, ignoring. Mar 25 01:35:27.379019 systemd[1]: Reload requested from client PID 1420 ('systemctl') (unit ensure-sysext.service)... Mar 25 01:35:27.379041 systemd[1]: Reloading... Mar 25 01:35:27.382165 systemd-tmpfiles[1421]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:35:27.382179 systemd-tmpfiles[1421]: Skipping /boot Mar 25 01:35:27.411051 systemd-tmpfiles[1421]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:35:27.411065 systemd-tmpfiles[1421]: Skipping /boot Mar 25 01:35:27.426822 systemd-udevd[1422]: Using default interface naming scheme 'v255'. Mar 25 01:35:27.491502 zram_generator::config[1452]: No configuration found. Mar 25 01:35:27.751817 kernel: mousedev: PS/2 mouse device common for all mice Mar 25 01:35:27.753361 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Mar 25 01:35:27.831537 kernel: hv_vmbus: registering driver hyperv_fb
Mar 25 01:35:27.841515 kernel: hv_vmbus: registering driver hv_balloon
Mar 25 01:35:27.880570 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
Mar 25 01:35:27.897502 kernel: hyperv_fb: Synthvid Version major 3, minor 5
Mar 25 01:35:27.902497 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
Mar 25 01:35:27.914792 kernel: Console: switching to colour dummy device 80x25
Mar 25 01:35:27.914324 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 25 01:35:27.914593 systemd[1]: Reloading finished in 534 ms.
Mar 25 01:35:27.920401 kernel: Console: switching to colour frame buffer device 128x48
Mar 25 01:35:27.930343 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 25 01:35:27.951458 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 25 01:35:28.093905 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:35:28.100396 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 25 01:35:28.119060 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 25 01:35:28.122950 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:35:28.127414 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 25 01:35:28.137009 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 25 01:35:28.148755 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 25 01:35:28.154918 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1525)
Mar 25 01:35:28.155051 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:35:28.155409 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:35:28.159107 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 25 01:35:28.170065 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 25 01:35:28.181758 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 25 01:35:28.192979 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 25 01:35:28.195934 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:35:28.205962 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 25 01:35:28.207529 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 25 01:35:28.211007 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 25 01:35:28.211195 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 25 01:35:28.229944 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 25 01:35:28.231053 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 25 01:35:28.264600 kernel: kvm_intel: Using Hyper-V Enlightened VMCS
Mar 25 01:35:28.282307 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 25 01:35:28.287411 systemd[1]: Finished ensure-sysext.service.
Mar 25 01:35:28.310848 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:35:28.311146 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:35:28.315334 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 25 01:35:28.330721 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 25 01:35:28.335702 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 25 01:35:28.344699 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 25 01:35:28.351540 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:35:28.351606 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:35:28.351703 systemd[1]: Reached target time-set.target - System Time Set.
Mar 25 01:35:28.359730 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 25 01:35:28.372223 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:35:28.379965 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:35:28.383517 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 25 01:35:28.387984 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 25 01:35:28.388205 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 25 01:35:28.403097 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 25 01:35:28.403352 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 25 01:35:28.427973 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 25 01:35:28.430692 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 25 01:35:28.432353 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:35:28.443394 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 25 01:35:28.444104 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 25 01:35:28.450139 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 25 01:35:28.450568 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 25 01:35:28.469972 augenrules[1650]: No rules
Mar 25 01:35:28.478725 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 25 01:35:28.481282 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 25 01:35:28.534687 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_Disk OEM.
Mar 25 01:35:28.538804 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 25 01:35:28.544409 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 25 01:35:28.548926 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 25 01:35:28.553648 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 25 01:35:28.556020 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:35:28.565187 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 25 01:35:28.630681 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 25 01:35:28.638645 lvm[1662]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 25 01:35:28.678029 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 25 01:35:28.678400 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 25 01:35:28.681916 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 25 01:35:28.694040 lvm[1677]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 25 01:35:28.719089 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 25 01:35:28.731665 systemd-resolved[1578]: Positive Trust Anchors:
Mar 25 01:35:28.731679 systemd-resolved[1578]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 25 01:35:28.731764 systemd-resolved[1578]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 25 01:35:28.735950 systemd-resolved[1578]: Using system hostname 'ci-4284.0.0-a-0ecc1f6a74'.
Mar 25 01:35:28.737317 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 25 01:35:28.737616 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 25 01:35:28.771937 systemd-networkd[1573]: lo: Link UP
Mar 25 01:35:28.771945 systemd-networkd[1573]: lo: Gained carrier
Mar 25 01:35:28.774696 systemd-networkd[1573]: Enumeration completed
Mar 25 01:35:28.774803 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 25 01:35:28.775056 systemd-networkd[1573]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:35:28.775061 systemd-networkd[1573]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:35:28.778290 systemd[1]: Reached target network.target - Network.
Mar 25 01:35:28.780890 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 25 01:35:28.786716 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 25 01:35:28.842452 kernel: mlx5_core 4ad1:00:02.0 enP19153s1: Link up
Mar 25 01:35:28.840673 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:35:28.863511 kernel: hv_netvsc 7c1e5234-9c19-7c1e-5234-9c197c1e5234 eth0: Data path switched to VF: enP19153s1
Mar 25 01:35:28.865755 systemd-networkd[1573]: enP19153s1: Link UP
Mar 25 01:35:28.866842 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 25 01:35:28.866918 systemd-networkd[1573]: eth0: Link UP
Mar 25 01:35:28.866922 systemd-networkd[1573]: eth0: Gained carrier
Mar 25 01:35:28.866937 systemd-networkd[1573]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:35:28.874790 systemd-networkd[1573]: enP19153s1: Gained carrier
Mar 25 01:35:28.914560 systemd-networkd[1573]: eth0: DHCPv4 address 10.200.8.14/24, gateway 10.200.8.1 acquired from 168.63.129.16
Mar 25 01:35:29.524181 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 25 01:35:29.528603 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 25 01:35:30.312665 systemd-networkd[1573]: enP19153s1: Gained IPv6LL
Mar 25 01:35:30.631871 ldconfig[1301]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 25 01:35:30.643061 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 25 01:35:30.648205 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 25 01:35:30.673199 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 25 01:35:30.677067 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 25 01:35:30.680354 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 25 01:35:30.683898 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 25 01:35:30.687688 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 25 01:35:30.690722 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 25 01:35:30.694373 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 25 01:35:30.697832 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 25 01:35:30.697874 systemd[1]: Reached target paths.target - Path Units.
Mar 25 01:35:30.700277 systemd[1]: Reached target timers.target - Timer Units.
Mar 25 01:35:30.717431 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 25 01:35:30.721971 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 25 01:35:30.727497 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 25 01:35:30.731182 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 25 01:35:30.734740 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 25 01:35:30.745154 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 25 01:35:30.749069 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 25 01:35:30.753145 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 25 01:35:30.756249 systemd[1]: Reached target sockets.target - Socket Units.
Mar 25 01:35:30.758979 systemd[1]: Reached target basic.target - Basic System.
Mar 25 01:35:30.761668 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 25 01:35:30.761703 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 25 01:35:30.761774 systemd-networkd[1573]: eth0: Gained IPv6LL
Mar 25 01:35:30.765011 systemd[1]: Starting chronyd.service - NTP client/server...
Mar 25 01:35:30.772576 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 25 01:35:30.781616 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 25 01:35:30.786217 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 25 01:35:30.792618 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 25 01:35:30.798694 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 25 01:35:30.803547 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 25 01:35:30.803609 systemd[1]: hv_fcopy_daemon.service - Hyper-V FCOPY daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_fcopy).
Mar 25 01:35:30.807719 systemd[1]: Started hv_kvp_daemon.service - Hyper-V KVP daemon.
Mar 25 01:35:30.809783 jq[1696]: false
Mar 25 01:35:30.811020 systemd[1]: hv_vss_daemon.service - Hyper-V VSS daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/vmbus/hv_vss).
Mar 25 01:35:30.813677 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 25 01:35:30.820669 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 25 01:35:30.826687 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 25 01:35:30.843634 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 25 01:35:30.860650 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 25 01:35:30.864970 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 25 01:35:30.865690 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 25 01:35:30.868566 systemd[1]: Starting update-engine.service - Update Engine...
Mar 25 01:35:30.868640 KVP[1701]: KVP starting; pid is:1701
Mar 25 01:35:30.877530 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 25 01:35:30.878434 KVP[1701]: KVP LIC Version: 3.1
Mar 25 01:35:30.879881 (chronyd)[1692]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
Mar 25 01:35:30.882498 kernel: hv_utils: KVP IC version 4.0
Mar 25 01:35:30.884163 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 25 01:35:30.892010 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 25 01:35:30.892719 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 25 01:35:30.894347 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 25 01:35:30.894955 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 25 01:35:30.913007 systemd[1]: Reached target network-online.target - Network is Online.
Mar 25 01:35:30.918160 chronyd[1727]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)
Mar 25 01:35:30.923114 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:35:30.934308 jq[1711]: true
Mar 25 01:35:30.935970 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 25 01:35:30.939543 chronyd[1727]: Timezone right/UTC failed leap second check, ignoring
Mar 25 01:35:30.939744 chronyd[1727]: Loaded seccomp filter (level 2)
Mar 25 01:35:30.947111 systemd[1]: Started chronyd.service - NTP client/server.
Mar 25 01:35:30.949399 dbus-daemon[1695]: [system] SELinux support is enabled
Mar 25 01:35:30.952862 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 25 01:35:30.955403 extend-filesystems[1700]: Found loop4
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found loop5
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found loop6
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found loop7
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found sda
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found sda1
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found sda2
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found sda3
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found usr
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found sda4
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found sda6
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found sda7
Mar 25 01:35:30.959933 extend-filesystems[1700]: Found sda9
Mar 25 01:35:30.959933 extend-filesystems[1700]: Checking size of /dev/sda9
Mar 25 01:35:30.974636 systemd[1]: motdgen.service: Deactivated successfully.
Mar 25 01:35:31.009072 update_engine[1709]: I20250325 01:35:30.999650 1709 main.cc:92] Flatcar Update Engine starting
Mar 25 01:35:31.009072 update_engine[1709]: I20250325 01:35:31.002825 1709 update_check_scheduler.cc:74] Next update check in 7m17s
Mar 25 01:35:30.974883 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 25 01:35:31.020792 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 25 01:35:31.044674 tar[1716]: linux-amd64/LICENSE
Mar 25 01:35:31.044674 tar[1716]: linux-amd64/helm
Mar 25 01:35:31.045010 extend-filesystems[1700]: Old size kept for /dev/sda9
Mar 25 01:35:31.045010 extend-filesystems[1700]: Found sr0
Mar 25 01:35:31.020839 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 25 01:35:31.026952 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 25 01:35:31.026975 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 25 01:35:31.032035 systemd[1]: Started update-engine.service - Update Engine.
Mar 25 01:35:31.033836 (ntainerd)[1737]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 25 01:35:31.035808 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 25 01:35:31.036298 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 25 01:35:31.067782 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 25 01:35:31.079540 systemd-logind[1706]: New seat seat0.
Mar 25 01:35:31.089427 systemd-logind[1706]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 25 01:35:31.089659 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 25 01:35:31.095438 jq[1735]: true
Mar 25 01:35:31.141742 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 25 01:35:31.167089 coreos-metadata[1694]: Mar 25 01:35:31.166 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
Mar 25 01:35:31.173292 coreos-metadata[1694]: Mar 25 01:35:31.172 INFO Fetch successful
Mar 25 01:35:31.173292 coreos-metadata[1694]: Mar 25 01:35:31.173 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
Mar 25 01:35:31.180504 coreos-metadata[1694]: Mar 25 01:35:31.178 INFO Fetch successful
Mar 25 01:35:31.181315 coreos-metadata[1694]: Mar 25 01:35:31.180 INFO Fetching http://168.63.129.16/machine/18fba4e2-0ed1-4e8e-8fc1-75948a224425/6faf3ac1%2D977f%2D4ab7%2D872c%2D2e7dc55ae4a7.%5Fci%2D4284.0.0%2Da%2D0ecc1f6a74?comp=config&type=sharedConfig&incarnation=1: Attempt #1
Mar 25 01:35:31.195763 coreos-metadata[1694]: Mar 25 01:35:31.195 INFO Fetch successful
Mar 25 01:35:31.196421 coreos-metadata[1694]: Mar 25 01:35:31.196 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
Mar 25 01:35:31.219591 coreos-metadata[1694]: Mar 25 01:35:31.219 INFO Fetch successful
Mar 25 01:35:31.265287 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1525)
Mar 25 01:35:31.288551 bash[1777]: Updated "/home/core/.ssh/authorized_keys"
Mar 25 01:35:31.290473 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 25 01:35:31.301941 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Mar 25 01:35:31.329533 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 25 01:35:31.333457 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 25 01:35:31.936658 sshd_keygen[1715]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 25 01:35:31.994025 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 25 01:35:32.003296 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 25 01:35:32.008148 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent...
Mar 25 01:35:32.041835 tar[1716]: linux-amd64/README.md
Mar 25 01:35:32.048875 systemd[1]: issuegen.service: Deactivated successfully.
Mar 25 01:35:32.049150 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 25 01:35:32.060998 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 25 01:35:32.076341 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent.
Mar 25 01:35:32.082305 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 25 01:35:32.092808 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 25 01:35:32.097797 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 25 01:35:32.104812 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 25 01:35:32.108087 systemd[1]: Reached target getty.target - Login Prompts.
Mar 25 01:35:32.174196 containerd[1737]: time="2025-03-25T01:35:32Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 25 01:35:32.176321 containerd[1737]: time="2025-03-25T01:35:32.174988600Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
Mar 25 01:35:32.184969 containerd[1737]: time="2025-03-25T01:35:32.184926800Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="5.8µs"
Mar 25 01:35:32.184969 containerd[1737]: time="2025-03-25T01:35:32.184955500Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 25 01:35:32.185097 containerd[1737]: time="2025-03-25T01:35:32.184981900Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 25 01:35:32.185164 containerd[1737]: time="2025-03-25T01:35:32.185136300Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 25 01:35:32.185212 containerd[1737]: time="2025-03-25T01:35:32.185162600Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 25 01:35:32.185212 containerd[1737]: time="2025-03-25T01:35:32.185204500Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 25 01:35:32.185308 containerd[1737]: time="2025-03-25T01:35:32.185281600Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 25 01:35:32.185308 containerd[1737]: time="2025-03-25T01:35:32.185300700Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 25 01:35:32.185580 containerd[1737]: time="2025-03-25T01:35:32.185551000Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 25 01:35:32.185580 containerd[1737]: time="2025-03-25T01:35:32.185571900Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 25 01:35:32.185673 containerd[1737]: time="2025-03-25T01:35:32.185587700Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 25 01:35:32.185673 containerd[1737]: time="2025-03-25T01:35:32.185598600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 25 01:35:32.185749 containerd[1737]: time="2025-03-25T01:35:32.185721600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 25 01:35:32.185997 containerd[1737]: time="2025-03-25T01:35:32.185966000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 25 01:35:32.186055 containerd[1737]: time="2025-03-25T01:35:32.186008400Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 25 01:35:32.186055 containerd[1737]: time="2025-03-25T01:35:32.186024800Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 25 01:35:32.186117 containerd[1737]: time="2025-03-25T01:35:32.186069600Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 25 01:35:32.186357 containerd[1737]: time="2025-03-25T01:35:32.186328800Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 25 01:35:32.186443 containerd[1737]: time="2025-03-25T01:35:32.186416100Z" level=info msg="metadata content store policy set" policy=shared
Mar 25 01:35:32.205227 containerd[1737]: time="2025-03-25T01:35:32.205148600Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205357000Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205386000Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205416300Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205437200Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205455100Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205498100Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205522000Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205539100Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205556200Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205570800Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205589800Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205726400Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205752600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 25 01:35:32.206394 containerd[1737]: time="2025-03-25T01:35:32.205769900Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 25 01:35:32.207915 containerd[1737]: time="2025-03-25T01:35:32.205787400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 25 01:35:32.207915 containerd[1737]: time="2025-03-25T01:35:32.205802000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 25 01:35:32.207915 containerd[1737]: time="2025-03-25T01:35:32.205815700Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 25 01:35:32.207915 containerd[1737]: time="2025-03-25T01:35:32.205832000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 25 01:35:32.207915 containerd[1737]: time="2025-03-25T01:35:32.205848500Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 25 01:35:32.207915 containerd[1737]: time="2025-03-25T01:35:32.205864100Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 25 01:35:32.207915 containerd[1737]: time="2025-03-25T01:35:32.205878900Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 25 01:35:32.207915 containerd[1737]: time="2025-03-25T01:35:32.205893400Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 25 01:35:32.207915 containerd[1737]: time="2025-03-25T01:35:32.205970000Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 25 01:35:32.207915 containerd[1737]: time="2025-03-25T01:35:32.205989500Z" level=info msg="Start snapshots syncer"
Mar 25 01:35:32.207915 containerd[1737]: time="2025-03-25T01:35:32.206013800Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 25 01:35:32.208294 containerd[1737]: time="2025-03-25T01:35:32.206830500Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMSco
reAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 25 01:35:32.208294 containerd[1737]: time="2025-03-25T01:35:32.206908300Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 25 01:35:32.208490 containerd[1737]: time="2025-03-25T01:35:32.207348600Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 25 01:35:32.208490 containerd[1737]: time="2025-03-25T01:35:32.207545700Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 25 01:35:32.208490 containerd[1737]: time="2025-03-25T01:35:32.207587200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 25 01:35:32.208490 containerd[1737]: time="2025-03-25T01:35:32.207606800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 25 01:35:32.208490 containerd[1737]: time="2025-03-25T01:35:32.207630700Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 25 01:35:32.208490 containerd[1737]: time="2025-03-25T01:35:32.207654500Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 25 
01:35:32.208490 containerd[1737]: time="2025-03-25T01:35:32.207675400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 25 01:35:32.208490 containerd[1737]: time="2025-03-25T01:35:32.207692600Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 25 01:35:32.208490 containerd[1737]: time="2025-03-25T01:35:32.207730400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 25 01:35:32.208490 containerd[1737]: time="2025-03-25T01:35:32.207754500Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 25 01:35:32.208490 containerd[1737]: time="2025-03-25T01:35:32.207773100Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 25 01:35:32.208842 containerd[1737]: time="2025-03-25T01:35:32.208603100Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:35:32.208842 containerd[1737]: time="2025-03-25T01:35:32.208655600Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:35:32.208842 containerd[1737]: time="2025-03-25T01:35:32.208672700Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:35:32.208842 containerd[1737]: time="2025-03-25T01:35:32.208692700Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:35:32.208842 containerd[1737]: time="2025-03-25T01:35:32.208724200Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 25 01:35:32.208842 containerd[1737]: time="2025-03-25T01:35:32.208741800Z" 
level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 25 01:35:32.208842 containerd[1737]: time="2025-03-25T01:35:32.208762900Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 25 01:35:32.208842 containerd[1737]: time="2025-03-25T01:35:32.208803300Z" level=info msg="runtime interface created" Mar 25 01:35:32.208842 containerd[1737]: time="2025-03-25T01:35:32.208812100Z" level=info msg="created NRI interface" Mar 25 01:35:32.208842 containerd[1737]: time="2025-03-25T01:35:32.208830900Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 25 01:35:32.209147 containerd[1737]: time="2025-03-25T01:35:32.208849900Z" level=info msg="Connect containerd service" Mar 25 01:35:32.209147 containerd[1737]: time="2025-03-25T01:35:32.208916400Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 25 01:35:32.211452 containerd[1737]: time="2025-03-25T01:35:32.210800300Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:35:32.611146 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 25 01:35:32.615978 (kubelet)[1871]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:35:32.938399 containerd[1737]: time="2025-03-25T01:35:32.933401900Z" level=info msg="Start subscribing containerd event"
Mar 25 01:35:32.938399 containerd[1737]: time="2025-03-25T01:35:32.933488800Z" level=info msg="Start recovering state"
Mar 25 01:35:32.938399 containerd[1737]: time="2025-03-25T01:35:32.933615700Z" level=info msg="Start event monitor"
Mar 25 01:35:32.938399 containerd[1737]: time="2025-03-25T01:35:32.933635700Z" level=info msg="Start cni network conf syncer for default"
Mar 25 01:35:32.938399 containerd[1737]: time="2025-03-25T01:35:32.933646400Z" level=info msg="Start streaming server"
Mar 25 01:35:32.938399 containerd[1737]: time="2025-03-25T01:35:32.933659100Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 25 01:35:32.938399 containerd[1737]: time="2025-03-25T01:35:32.933668300Z" level=info msg="runtime interface starting up..."
Mar 25 01:35:32.938399 containerd[1737]: time="2025-03-25T01:35:32.933677200Z" level=info msg="starting plugins..."
Mar 25 01:35:32.938399 containerd[1737]: time="2025-03-25T01:35:32.933695500Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 25 01:35:32.938399 containerd[1737]: time="2025-03-25T01:35:32.933672400Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 25 01:35:32.938399 containerd[1737]: time="2025-03-25T01:35:32.933844400Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 25 01:35:32.934034 systemd[1]: Started containerd.service - containerd container runtime.
Mar 25 01:35:32.940961 containerd[1737]: time="2025-03-25T01:35:32.939168000Z" level=info msg="containerd successfully booted in 0.765724s"
Mar 25 01:35:32.943170 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 25 01:35:32.948230 systemd[1]: Startup finished in 770ms (firmware) + 26.234s (loader) + 972ms (kernel) + 10.254s (initrd) + 10.233s (userspace) = 48.467s.
Mar 25 01:35:33.243283 kubelet[1871]: E0325 01:35:33.243165 1871 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:35:33.245461 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:35:33.245679 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:35:33.246219 systemd[1]: kubelet.service: Consumed 938ms CPU time, 251M memory peak.
Mar 25 01:35:33.282766 login[1858]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying
Mar 25 01:35:33.283248 login[1857]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 25 01:35:33.298013 systemd-logind[1706]: New session 1 of user core.
Mar 25 01:35:33.299692 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 25 01:35:33.301074 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 25 01:35:33.322991 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 25 01:35:33.328769 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 25 01:35:33.353850 (systemd)[1891]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 25 01:35:33.356284 systemd-logind[1706]: New session c1 of user core.
Mar 25 01:35:33.521946 locksmithd[1748]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 25 01:35:33.580059 systemd[1891]: Queued start job for default target default.target.
Mar 25 01:35:33.587559 systemd[1891]: Created slice app.slice - User Application Slice.
Mar 25 01:35:33.587589 systemd[1891]: Reached target paths.target - Paths.
Mar 25 01:35:33.587628 systemd[1891]: Reached target timers.target - Timers.
Mar 25 01:35:33.589095 systemd[1891]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 25 01:35:33.602186 systemd[1891]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 25 01:35:33.602510 systemd[1891]: Reached target sockets.target - Sockets.
Mar 25 01:35:33.602748 systemd[1891]: Reached target basic.target - Basic System.
Mar 25 01:35:33.602823 systemd[1891]: Reached target default.target - Main User Target.
Mar 25 01:35:33.602860 systemd[1891]: Startup finished in 240ms.
Mar 25 01:35:33.603300 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 25 01:35:33.608681 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 25 01:35:33.713883 waagent[1854]: 2025-03-25T01:35:33.713802Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.714191Z INFO Daemon Daemon OS: flatcar 4284.0.0
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.715279Z INFO Daemon Daemon Python: 3.11.11
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.716537Z INFO Daemon Daemon Run daemon
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.716917Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4284.0.0'
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.717302Z INFO Daemon Daemon Using waagent for provisioning
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.718375Z INFO Daemon Daemon Activate resource disk
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.719165Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.723566Z INFO Daemon Daemon Found device: None
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.724502Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.724953Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.725857Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Mar 25 01:35:33.751901 waagent[1854]: 2025-03-25T01:35:33.726947Z INFO Daemon Daemon Running default provisioning handler
Mar 25 01:35:33.755487 waagent[1854]: 2025-03-25T01:35:33.755141Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4.
Mar 25 01:35:33.762183 waagent[1854]: 2025-03-25T01:35:33.762138Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service'
Mar 25 01:35:33.766732 waagent[1854]: 2025-03-25T01:35:33.766687Z INFO Daemon Daemon cloud-init is enabled: False
Mar 25 01:35:33.771330 waagent[1854]: 2025-03-25T01:35:33.766850Z INFO Daemon Daemon Copying ovf-env.xml
Mar 25 01:35:33.855974 waagent[1854]: 2025-03-25T01:35:33.853052Z INFO Daemon Daemon Successfully mounted dvd
Mar 25 01:35:33.880729 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully.
Mar 25 01:35:33.882641 waagent[1854]: 2025-03-25T01:35:33.882361Z INFO Daemon Daemon Detect protocol endpoint
Mar 25 01:35:33.898002 waagent[1854]: 2025-03-25T01:35:33.884961Z INFO Daemon Daemon Clean protocol and wireserver endpoint
Mar 25 01:35:33.898002 waagent[1854]: 2025-03-25T01:35:33.885173Z INFO Daemon Daemon WireServer endpoint is not found. Rerun dhcp handler
Mar 25 01:35:33.898002 waagent[1854]: 2025-03-25T01:35:33.886145Z INFO Daemon Daemon Test for route to 168.63.129.16
Mar 25 01:35:33.898002 waagent[1854]: 2025-03-25T01:35:33.887153Z INFO Daemon Daemon Route to 168.63.129.16 exists
Mar 25 01:35:33.898002 waagent[1854]: 2025-03-25T01:35:33.887946Z INFO Daemon Daemon Wire server endpoint:168.63.129.16
Mar 25 01:35:33.901025 waagent[1854]: 2025-03-25T01:35:33.900981Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05
Mar 25 01:35:33.909263 waagent[1854]: 2025-03-25T01:35:33.901331Z INFO Daemon Daemon Wire protocol version:2012-11-30
Mar 25 01:35:33.909263 waagent[1854]: 2025-03-25T01:35:33.902130Z INFO Daemon Daemon Server preferred version:2015-04-05
Mar 25 01:35:34.016954 waagent[1854]: 2025-03-25T01:35:34.016859Z INFO Daemon Daemon Initializing goal state during protocol detection
Mar 25 01:35:34.020914 waagent[1854]: 2025-03-25T01:35:34.020854Z INFO Daemon Daemon Forcing an update of the goal state.
Mar 25 01:35:34.027638 waagent[1854]: 2025-03-25T01:35:34.027587Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1]
Mar 25 01:35:34.043741 waagent[1854]: 2025-03-25T01:35:34.043695Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.164
Mar 25 01:35:34.062833 waagent[1854]: 2025-03-25T01:35:34.044322Z INFO Daemon
Mar 25 01:35:34.062833 waagent[1854]: 2025-03-25T01:35:34.044905Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: ef7572d7-d1f4-45eb-8eba-fe14e0251d2e eTag: 12302763063386066403 source: Fabric]
Mar 25 01:35:34.062833 waagent[1854]: 2025-03-25T01:35:34.045615Z INFO Daemon The vmSettings originated via Fabric; will ignore them.
Mar 25 01:35:34.062833 waagent[1854]: 2025-03-25T01:35:34.046019Z INFO Daemon
Mar 25 01:35:34.062833 waagent[1854]: 2025-03-25T01:35:34.046696Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1]
Mar 25 01:35:34.062833 waagent[1854]: 2025-03-25T01:35:34.051540Z INFO Daemon Daemon Downloading artifacts profile blob
Mar 25 01:35:34.141897 waagent[1854]: 2025-03-25T01:35:34.141776Z INFO Daemon Downloaded certificate {'thumbprint': 'EB2A6C7C6ABD7010BD4D81056442C85A10ABABD1', 'hasPrivateKey': True}
Mar 25 01:35:34.147164 waagent[1854]: 2025-03-25T01:35:34.147109Z INFO Daemon Downloaded certificate {'thumbprint': '271DD035DC2A91B251141A7A337E1BBB9E3A36DF', 'hasPrivateKey': False}
Mar 25 01:35:34.152281 waagent[1854]: 2025-03-25T01:35:34.152227Z INFO Daemon Fetch goal state completed
Mar 25 01:35:34.162566 waagent[1854]: 2025-03-25T01:35:34.162524Z INFO Daemon Daemon Starting provisioning
Mar 25 01:35:34.165200 waagent[1854]: 2025-03-25T01:35:34.165060Z INFO Daemon Daemon Handle ovf-env.xml.
Mar 25 01:35:34.165200 waagent[1854]: 2025-03-25T01:35:34.165229Z INFO Daemon Daemon Set hostname [ci-4284.0.0-a-0ecc1f6a74]
Mar 25 01:35:34.194512 waagent[1854]: 2025-03-25T01:35:34.194417Z INFO Daemon Daemon Publish hostname [ci-4284.0.0-a-0ecc1f6a74]
Mar 25 01:35:34.202572 waagent[1854]: 2025-03-25T01:35:34.194935Z INFO Daemon Daemon Examine /proc/net/route for primary interface
Mar 25 01:35:34.202572 waagent[1854]: 2025-03-25T01:35:34.195915Z INFO Daemon Daemon Primary interface is [eth0]
Mar 25 01:35:34.205296 systemd-networkd[1573]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:35:34.205306 systemd-networkd[1573]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:35:34.205339 systemd-networkd[1573]: eth0: DHCP lease lost
Mar 25 01:35:34.206501 waagent[1854]: 2025-03-25T01:35:34.206440Z INFO Daemon Daemon Create user account if not exists
Mar 25 01:35:34.223994 waagent[1854]: 2025-03-25T01:35:34.206831Z INFO Daemon Daemon User core already exists, skip useradd
Mar 25 01:35:34.223994 waagent[1854]: 2025-03-25T01:35:34.207381Z INFO Daemon Daemon Configure sudoer
Mar 25 01:35:34.223994 waagent[1854]: 2025-03-25T01:35:34.208565Z INFO Daemon Daemon Configure sshd
Mar 25 01:35:34.223994 waagent[1854]: 2025-03-25T01:35:34.208949Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive.
Mar 25 01:35:34.223994 waagent[1854]: 2025-03-25T01:35:34.209948Z INFO Daemon Daemon Deploy ssh public key.
Mar 25 01:35:34.245534 systemd-networkd[1573]: eth0: DHCPv4 address 10.200.8.14/24, gateway 10.200.8.1 acquired from 168.63.129.16
Mar 25 01:35:34.284331 login[1858]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Mar 25 01:35:34.289194 systemd-logind[1706]: New session 2 of user core.
Mar 25 01:35:34.298620 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 25 01:35:35.324497 waagent[1854]: 2025-03-25T01:35:35.324396Z INFO Daemon Daemon Provisioning complete
Mar 25 01:35:35.339147 waagent[1854]: 2025-03-25T01:35:35.339087Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping
Mar 25 01:35:35.346820 waagent[1854]: 2025-03-25T01:35:35.339558Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions.
Mar 25 01:35:35.346820 waagent[1854]: 2025-03-25T01:35:35.340462Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent
Mar 25 01:35:35.467729 waagent[1950]: 2025-03-25T01:35:35.467637Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4)
Mar 25 01:35:35.468111 waagent[1950]: 2025-03-25T01:35:35.467807Z INFO ExtHandler ExtHandler OS: flatcar 4284.0.0
Mar 25 01:35:35.468111 waagent[1950]: 2025-03-25T01:35:35.467883Z INFO ExtHandler ExtHandler Python: 3.11.11
Mar 25 01:35:35.468111 waagent[1950]: 2025-03-25T01:35:35.467959Z INFO ExtHandler ExtHandler CPU Arch: x86_64
Mar 25 01:35:35.515720 waagent[1950]: 2025-03-25T01:35:35.515628Z INFO ExtHandler ExtHandler Distro: flatcar-4284.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.11; Arch: x86_64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.20.1;
Mar 25 01:35:35.515976 waagent[1950]: 2025-03-25T01:35:35.515926Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Mar 25 01:35:35.516072 waagent[1950]: 2025-03-25T01:35:35.516039Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16
Mar 25 01:35:35.523430 waagent[1950]: 2025-03-25T01:35:35.523359Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1]
Mar 25 01:35:35.529019 waagent[1950]: 2025-03-25T01:35:35.528970Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.164
Mar 25 01:35:35.529468 waagent[1950]: 2025-03-25T01:35:35.529418Z INFO ExtHandler
Mar 25 01:35:35.529568 waagent[1950]: 2025-03-25T01:35:35.529519Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: defc7f40-762e-4c48-94fc-c6ac9a2685a4 eTag: 12302763063386066403 source: Fabric]
Mar 25 01:35:35.529877 waagent[1950]: 2025-03-25T01:35:35.529829Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them.
Mar 25 01:35:35.530402 waagent[1950]: 2025-03-25T01:35:35.530352Z INFO ExtHandler
Mar 25 01:35:35.530488 waagent[1950]: 2025-03-25T01:35:35.530425Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1]
Mar 25 01:35:35.534343 waagent[1950]: 2025-03-25T01:35:35.534302Z INFO ExtHandler ExtHandler Downloading artifacts profile blob
Mar 25 01:35:35.609769 waagent[1950]: 2025-03-25T01:35:35.609640Z INFO ExtHandler Downloaded certificate {'thumbprint': 'EB2A6C7C6ABD7010BD4D81056442C85A10ABABD1', 'hasPrivateKey': True}
Mar 25 01:35:35.610130 waagent[1950]: 2025-03-25T01:35:35.610082Z INFO ExtHandler Downloaded certificate {'thumbprint': '271DD035DC2A91B251141A7A337E1BBB9E3A36DF', 'hasPrivateKey': False}
Mar 25 01:35:35.610573 waagent[1950]: 2025-03-25T01:35:35.610529Z INFO ExtHandler Fetch goal state completed
Mar 25 01:35:35.625440 waagent[1950]: 2025-03-25T01:35:35.625384Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.3.3 11 Feb 2025 (Library: OpenSSL 3.3.3 11 Feb 2025)
Mar 25 01:35:35.630162 waagent[1950]: 2025-03-25T01:35:35.630108Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 1950
Mar 25 01:35:35.630298 waagent[1950]: 2025-03-25T01:35:35.630262Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ********
Mar 25 01:35:35.630652 waagent[1950]: 2025-03-25T01:35:35.630611Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ********
Mar 25 01:35:35.632045 waagent[1950]: 2025-03-25T01:35:35.631997Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '4284.0.0', '', 'Flatcar Container Linux by Kinvolk']
Mar 25 01:35:35.632446 waagent[1950]: 2025-03-25T01:35:35.632402Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4284.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported
Mar 25 01:35:35.632617 waagent[1950]: 2025-03-25T01:35:35.632580Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False
Mar 25 01:35:35.633175 waagent[1950]: 2025-03-25T01:35:35.633132Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules
Mar 25 01:35:35.664091 waagent[1950]: 2025-03-25T01:35:35.664049Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service
Mar 25 01:35:35.664292 waagent[1950]: 2025-03-25T01:35:35.664250Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup
Mar 25 01:35:35.671086 waagent[1950]: 2025-03-25T01:35:35.670790Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now
Mar 25 01:35:35.677377 systemd[1]: Reload requested from client PID 1967 ('systemctl') (unit waagent.service)...
Mar 25 01:35:35.677394 systemd[1]: Reloading...
Mar 25 01:35:35.772525 zram_generator::config[2003]: No configuration found.
Mar 25 01:35:35.903738 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 01:35:36.018404 systemd[1]: Reloading finished in 340 ms.
Mar 25 01:35:36.035716 waagent[1950]: 2025-03-25T01:35:36.035085Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service
Mar 25 01:35:36.035716 waagent[1950]: 2025-03-25T01:35:36.035270Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully
Mar 25 01:35:36.368700 waagent[1950]: 2025-03-25T01:35:36.368606Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up.
Mar 25 01:35:36.369089 waagent[1950]: 2025-03-25T01:35:36.369032Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True]
Mar 25 01:35:36.369992 waagent[1950]: 2025-03-25T01:35:36.369922Z INFO ExtHandler ExtHandler Starting env monitor service.
Mar 25 01:35:36.370139 waagent[1950]: 2025-03-25T01:35:36.370086Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Mar 25 01:35:36.370306 waagent[1950]: 2025-03-25T01:35:36.370256Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16
Mar 25 01:35:36.370828 waagent[1950]: 2025-03-25T01:35:36.370769Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service.
Mar 25 01:35:36.370983 waagent[1950]: 2025-03-25T01:35:36.370937Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file
Mar 25 01:35:36.371956 waagent[1950]: 2025-03-25T01:35:36.371894Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled.
Mar 25 01:35:36.372060 waagent[1950]: 2025-03-25T01:35:36.371997Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread
Mar 25 01:35:36.372140 waagent[1950]: 2025-03-25T01:35:36.372057Z INFO ExtHandler ExtHandler Start Extension Telemetry service.
Mar 25 01:35:36.372465 waagent[1950]: 2025-03-25T01:35:36.372409Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True
Mar 25 01:35:36.372830 waagent[1950]: 2025-03-25T01:35:36.372753Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status.
Mar 25 01:35:36.373050 waagent[1950]: 2025-03-25T01:35:36.372994Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16
Mar 25 01:35:36.373640 waagent[1950]: 2025-03-25T01:35:36.373592Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread
Mar 25 01:35:36.373850 waagent[1950]: 2025-03-25T01:35:36.373801Z INFO EnvHandler ExtHandler Configure routes
Mar 25 01:35:36.373916 waagent[1950]: 2025-03-25T01:35:36.373875Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route:
Mar 25 01:35:36.373916 waagent[1950]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT
Mar 25 01:35:36.373916 waagent[1950]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0
Mar 25 01:35:36.373916 waagent[1950]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0
Mar 25 01:35:36.373916 waagent[1950]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0
Mar 25 01:35:36.373916 waagent[1950]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Mar 25 01:35:36.373916 waagent[1950]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0
Mar 25 01:35:36.375039 waagent[1950]: 2025-03-25T01:35:36.374987Z INFO EnvHandler ExtHandler Gateway:None
Mar 25 01:35:36.375676 waagent[1950]: 2025-03-25T01:35:36.375630Z INFO EnvHandler ExtHandler Routes:None
Mar 25 01:35:36.381801 waagent[1950]: 2025-03-25T01:35:36.381755Z INFO ExtHandler ExtHandler
Mar 25 01:35:36.381888 waagent[1950]: 2025-03-25T01:35:36.381856Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 5e85a4e8-3f6e-4178-aa4d-a7b75a120556 correlation cb2bf197-9887-4ff4-ac1c-7d47cb93b1d3 created: 2025-03-25T01:34:32.988724Z]
Mar 25 01:35:36.382885 waagent[1950]: 2025-03-25T01:35:36.382838Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything.
Mar 25 01:35:36.385028 waagent[1950]: 2025-03-25T01:35:36.384992Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 3 ms]
Mar 25 01:35:36.418084 waagent[1950]: 2025-03-25T01:35:36.418022Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 1452B76D-AA30-4CA8-995D-968F445A27D8;DroppedPackets: 0;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;]
Mar 25 01:35:36.423915 waagent[1950]: 2025-03-25T01:35:36.423856Z INFO MonitorHandler ExtHandler Network interfaces:
Mar 25 01:35:36.423915 waagent[1950]: Executing ['ip', '-a', '-o', 'link']:
Mar 25 01:35:36.423915 waagent[1950]: 1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
Mar 25 01:35:36.423915 waagent[1950]: 2: eth0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:34:9c:19 brd ff:ff:ff:ff:ff:ff
Mar 25 01:35:36.423915 waagent[1950]: 3: enP19153s1: <BROADCAST,MULTICAST,SLAVE,UP,LOWER_UP> mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:34:9c:19 brd ff:ff:ff:ff:ff:ff\ altname enP19153p0s2
Mar 25 01:35:36.423915 waagent[1950]: Executing ['ip', '-4', '-a', '-o', 'address']:
Mar 25 01:35:36.423915 waagent[1950]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
Mar 25 01:35:36.423915 waagent[1950]: 2: eth0 inet 10.200.8.14/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever
Mar 25 01:35:36.423915 waagent[1950]: Executing ['ip', '-6', '-a', '-o', 'address']:
Mar 25 01:35:36.423915 waagent[1950]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
Mar 25 01:35:36.423915 waagent[1950]: 2: eth0 inet6 fe80::7e1e:52ff:fe34:9c19/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
Mar 25 01:35:36.423915 waagent[1950]: 3: enP19153s1 inet6 fe80::7e1e:52ff:fe34:9c19/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
Mar 25 01:35:36.484656 waagent[1950]: 2025-03-25T01:35:36.484594Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric:
Mar 25 01:35:36.484656 waagent[1950]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 25 01:35:36.484656 waagent[1950]: pkts bytes target prot opt in out source destination
Mar 25 01:35:36.484656 waagent[1950]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Mar 25 01:35:36.484656 waagent[1950]: pkts bytes target prot opt in out source destination
Mar 25 01:35:36.484656 waagent[1950]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 25 01:35:36.484656 waagent[1950]: pkts bytes target prot opt in out source destination
Mar 25 01:35:36.484656 waagent[1950]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Mar 25 01:35:36.484656 waagent[1950]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Mar 25 01:35:36.484656 waagent[1950]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Mar 25 01:35:36.488063 waagent[1950]: 2025-03-25T01:35:36.488006Z INFO EnvHandler ExtHandler Current Firewall rules:
Mar 25 01:35:36.488063 waagent[1950]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 25 01:35:36.488063 waagent[1950]: pkts bytes target prot opt in out source destination
Mar 25 01:35:36.488063 waagent[1950]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
Mar 25 01:35:36.488063 waagent[1950]: pkts bytes target prot opt in out source destination
Mar 25 01:35:36.488063 waagent[1950]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes)
Mar 25 01:35:36.488063 waagent[1950]: pkts bytes target prot opt in out source destination
Mar 25 01:35:36.488063 waagent[1950]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
Mar 25 01:35:36.488063 waagent[1950]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
Mar 25 01:35:36.488063 waagent[1950]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
Mar 25 01:35:36.488493 waagent[1950]: 2025-03-25T01:35:36.488309Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300
Mar 25 01:35:43.419013 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 25 01:35:43.421168 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:35:43.550086 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:35:43.556801 (kubelet)[2104]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:35:44.210884 kubelet[2104]: E0325 01:35:44.210827 2104 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:35:44.214330 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:35:44.214545 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:35:44.214953 systemd[1]: kubelet.service: Consumed 167ms CPU time, 104.8M memory peak.
Mar 25 01:35:49.985227 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 25 01:35:49.986715 systemd[1]: Started sshd@0-10.200.8.14:22-10.200.16.10:47158.service - OpenSSH per-connection server daemon (10.200.16.10:47158).
Mar 25 01:35:50.690529 sshd[2113]: Accepted publickey for core from 10.200.16.10 port 47158 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA
Mar 25 01:35:50.692163 sshd-session[2113]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:35:50.698084 systemd-logind[1706]: New session 3 of user core.
Mar 25 01:35:50.707646 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 25 01:35:51.245386 systemd[1]: Started sshd@1-10.200.8.14:22-10.200.16.10:47160.service - OpenSSH per-connection server daemon (10.200.16.10:47160). Mar 25 01:35:51.888563 sshd[2118]: Accepted publickey for core from 10.200.16.10 port 47160 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:35:51.890193 sshd-session[2118]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:35:51.894908 systemd-logind[1706]: New session 4 of user core. Mar 25 01:35:51.901648 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 25 01:35:52.341455 sshd[2120]: Connection closed by 10.200.16.10 port 47160 Mar 25 01:35:52.342310 sshd-session[2118]: pam_unix(sshd:session): session closed for user core Mar 25 01:35:52.345782 systemd[1]: sshd@1-10.200.8.14:22-10.200.16.10:47160.service: Deactivated successfully. Mar 25 01:35:52.348176 systemd[1]: session-4.scope: Deactivated successfully. Mar 25 01:35:52.350113 systemd-logind[1706]: Session 4 logged out. Waiting for processes to exit. Mar 25 01:35:52.351239 systemd-logind[1706]: Removed session 4. Mar 25 01:35:52.472225 systemd[1]: Started sshd@2-10.200.8.14:22-10.200.16.10:47166.service - OpenSSH per-connection server daemon (10.200.16.10:47166). Mar 25 01:35:53.103888 sshd[2126]: Accepted publickey for core from 10.200.16.10 port 47166 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:35:53.106176 sshd-session[2126]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:35:53.110739 systemd-logind[1706]: New session 5 of user core. Mar 25 01:35:53.121621 systemd[1]: Started session-5.scope - Session 5 of User core. 
Mar 25 01:35:53.551764 sshd[2128]: Connection closed by 10.200.16.10 port 47166 Mar 25 01:35:53.552619 sshd-session[2126]: pam_unix(sshd:session): session closed for user core Mar 25 01:35:53.556899 systemd[1]: sshd@2-10.200.8.14:22-10.200.16.10:47166.service: Deactivated successfully. Mar 25 01:35:53.558754 systemd[1]: session-5.scope: Deactivated successfully. Mar 25 01:35:53.559502 systemd-logind[1706]: Session 5 logged out. Waiting for processes to exit. Mar 25 01:35:53.560404 systemd-logind[1706]: Removed session 5. Mar 25 01:35:53.662220 systemd[1]: Started sshd@3-10.200.8.14:22-10.200.16.10:47170.service - OpenSSH per-connection server daemon (10.200.16.10:47170). Mar 25 01:35:54.300114 sshd[2134]: Accepted publickey for core from 10.200.16.10 port 47170 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:35:54.301771 sshd-session[2134]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:35:54.302882 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 25 01:35:54.305643 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:35:54.309839 systemd-logind[1706]: New session 6 of user core. Mar 25 01:35:54.318891 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 25 01:35:54.430115 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:35:54.439813 (kubelet)[2145]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:35:54.743096 sshd[2139]: Connection closed by 10.200.16.10 port 47170 Mar 25 01:35:54.743882 sshd-session[2134]: pam_unix(sshd:session): session closed for user core Mar 25 01:35:54.746896 systemd[1]: sshd@3-10.200.8.14:22-10.200.16.10:47170.service: Deactivated successfully. Mar 25 01:35:54.748943 systemd[1]: session-6.scope: Deactivated successfully. 
Mar 25 01:35:54.750469 systemd-logind[1706]: Session 6 logged out. Waiting for processes to exit. Mar 25 01:35:54.751648 systemd-logind[1706]: Removed session 6. Mar 25 01:35:54.754560 chronyd[1727]: Selected source PHC0 Mar 25 01:35:54.854587 systemd[1]: Started sshd@4-10.200.8.14:22-10.200.16.10:47176.service - OpenSSH per-connection server daemon (10.200.16.10:47176). Mar 25 01:35:55.068328 kubelet[2145]: E0325 01:35:55.067877 2145 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:35:55.070400 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:35:55.070615 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:35:55.071011 systemd[1]: kubelet.service: Consumed 166ms CPU time, 102.5M memory peak. Mar 25 01:35:55.488289 sshd[2155]: Accepted publickey for core from 10.200.16.10 port 47176 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:35:55.489902 sshd-session[2155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:35:55.494306 systemd-logind[1706]: New session 7 of user core. Mar 25 01:35:55.497639 systemd[1]: Started session-7.scope - Session 7 of User core. 
Mar 25 01:35:55.951288 sudo[2160]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 25 01:35:55.951649 sudo[2160]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:35:55.978962 sudo[2160]: pam_unix(sudo:session): session closed for user root Mar 25 01:35:56.092007 sshd[2159]: Connection closed by 10.200.16.10 port 47176 Mar 25 01:35:56.093221 sshd-session[2155]: pam_unix(sshd:session): session closed for user core Mar 25 01:35:56.097934 systemd[1]: sshd@4-10.200.8.14:22-10.200.16.10:47176.service: Deactivated successfully. Mar 25 01:35:56.100080 systemd[1]: session-7.scope: Deactivated successfully. Mar 25 01:35:56.101095 systemd-logind[1706]: Session 7 logged out. Waiting for processes to exit. Mar 25 01:35:56.102040 systemd-logind[1706]: Removed session 7. Mar 25 01:35:56.207623 systemd[1]: Started sshd@5-10.200.8.14:22-10.200.16.10:47178.service - OpenSSH per-connection server daemon (10.200.16.10:47178). Mar 25 01:35:56.840966 sshd[2166]: Accepted publickey for core from 10.200.16.10 port 47178 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:35:56.842694 sshd-session[2166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:35:56.846930 systemd-logind[1706]: New session 8 of user core. Mar 25 01:35:56.853626 systemd[1]: Started session-8.scope - Session 8 of User core. 
Mar 25 01:35:57.186972 sudo[2170]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 25 01:35:57.187405 sudo[2170]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:35:57.190797 sudo[2170]: pam_unix(sudo:session): session closed for user root Mar 25 01:35:57.196015 sudo[2169]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 25 01:35:57.196347 sudo[2169]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:35:57.205400 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:35:57.243338 augenrules[2192]: No rules Mar 25 01:35:57.244733 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:35:57.244998 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:35:57.246435 sudo[2169]: pam_unix(sudo:session): session closed for user root Mar 25 01:35:57.353913 sshd[2168]: Connection closed by 10.200.16.10 port 47178 Mar 25 01:35:57.354760 sshd-session[2166]: pam_unix(sshd:session): session closed for user core Mar 25 01:35:57.358295 systemd[1]: sshd@5-10.200.8.14:22-10.200.16.10:47178.service: Deactivated successfully. Mar 25 01:35:57.360679 systemd[1]: session-8.scope: Deactivated successfully. Mar 25 01:35:57.362576 systemd-logind[1706]: Session 8 logged out. Waiting for processes to exit. Mar 25 01:35:57.363439 systemd-logind[1706]: Removed session 8. Mar 25 01:35:57.466159 systemd[1]: Started sshd@6-10.200.8.14:22-10.200.16.10:47184.service - OpenSSH per-connection server daemon (10.200.16.10:47184). 
Mar 25 01:35:58.099107 sshd[2201]: Accepted publickey for core from 10.200.16.10 port 47184 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:35:58.100768 sshd-session[2201]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:35:58.106843 systemd-logind[1706]: New session 9 of user core. Mar 25 01:35:58.112641 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 25 01:35:58.444627 sudo[2204]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 25 01:35:58.445033 sudo[2204]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:35:59.958581 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 25 01:35:59.973877 (dockerd)[2221]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 25 01:36:01.303960 dockerd[2221]: time="2025-03-25T01:36:01.303896900Z" level=info msg="Starting up" Mar 25 01:36:01.306295 dockerd[2221]: time="2025-03-25T01:36:01.306256900Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 25 01:36:01.420989 dockerd[2221]: time="2025-03-25T01:36:01.420944700Z" level=info msg="Loading containers: start." Mar 25 01:36:01.674525 kernel: Initializing XFRM netlink socket Mar 25 01:36:01.796177 systemd-networkd[1573]: docker0: Link UP Mar 25 01:36:01.863639 dockerd[2221]: time="2025-03-25T01:36:01.863595500Z" level=info msg="Loading containers: done." 
Mar 25 01:36:01.883583 dockerd[2221]: time="2025-03-25T01:36:01.883534100Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 25 01:36:01.883765 dockerd[2221]: time="2025-03-25T01:36:01.883632000Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 Mar 25 01:36:01.883765 dockerd[2221]: time="2025-03-25T01:36:01.883758700Z" level=info msg="Daemon has completed initialization" Mar 25 01:36:01.933169 dockerd[2221]: time="2025-03-25T01:36:01.933039700Z" level=info msg="API listen on /run/docker.sock" Mar 25 01:36:01.933506 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 25 01:36:02.652349 containerd[1737]: time="2025-03-25T01:36:02.652298900Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\"" Mar 25 01:36:03.315079 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1056039306.mount: Deactivated successfully. 
Mar 25 01:36:04.750216 containerd[1737]: time="2025-03-25T01:36:04.750158218Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:04.752901 containerd[1737]: time="2025-03-25T01:36:04.752827946Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.3: active requests=0, bytes read=28682438" Mar 25 01:36:04.756316 containerd[1737]: time="2025-03-25T01:36:04.756259983Z" level=info msg="ImageCreate event name:\"sha256:f8bdc4cfa0651e2d7edb4678d2b90129aef82a19249b37dc8d4705e8bd604295\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:04.760455 containerd[1737]: time="2025-03-25T01:36:04.760415128Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:04.761533 containerd[1737]: time="2025-03-25T01:36:04.761325737Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.3\" with image id \"sha256:f8bdc4cfa0651e2d7edb4678d2b90129aef82a19249b37dc8d4705e8bd604295\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\", size \"28679230\" in 2.108986737s" Mar 25 01:36:04.761533 containerd[1737]: time="2025-03-25T01:36:04.761369238Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\" returns image reference \"sha256:f8bdc4cfa0651e2d7edb4678d2b90129aef82a19249b37dc8d4705e8bd604295\"" Mar 25 01:36:04.762316 containerd[1737]: time="2025-03-25T01:36:04.762134046Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\"" Mar 25 01:36:05.169246 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. 
Mar 25 01:36:05.172437 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:36:05.640184 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:36:05.649763 (kubelet)[2479]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:36:05.687387 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:36:05.940527 kubelet[2479]: E0325 01:36:05.686002 2479 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:36:05.687533 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:36:05.687876 systemd[1]: kubelet.service: Consumed 147ms CPU time, 103.9M memory peak. 
Mar 25 01:36:07.152552 containerd[1737]: time="2025-03-25T01:36:07.152496943Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:07.154418 containerd[1737]: time="2025-03-25T01:36:07.154347763Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.3: active requests=0, bytes read=24779692" Mar 25 01:36:07.156670 containerd[1737]: time="2025-03-25T01:36:07.156620188Z" level=info msg="ImageCreate event name:\"sha256:085818208a5213f37ef6d103caaf8e1e243816a614eb5b87a98bfffe79c687b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:07.161129 containerd[1737]: time="2025-03-25T01:36:07.161077935Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:07.162110 containerd[1737]: time="2025-03-25T01:36:07.161968345Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.3\" with image id \"sha256:085818208a5213f37ef6d103caaf8e1e243816a614eb5b87a98bfffe79c687b5\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\", size \"26267292\" in 2.399797799s" Mar 25 01:36:07.162110 containerd[1737]: time="2025-03-25T01:36:07.162007745Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\" returns image reference \"sha256:085818208a5213f37ef6d103caaf8e1e243816a614eb5b87a98bfffe79c687b5\"" Mar 25 01:36:07.162886 containerd[1737]: time="2025-03-25T01:36:07.162836454Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\"" Mar 25 01:36:08.510414 containerd[1737]: time="2025-03-25T01:36:08.510356484Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-scheduler:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:08.512304 containerd[1737]: time="2025-03-25T01:36:08.512235804Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.3: active requests=0, bytes read=19171427" Mar 25 01:36:08.515676 containerd[1737]: time="2025-03-25T01:36:08.515623141Z" level=info msg="ImageCreate event name:\"sha256:b4260bf5078ab1b01dd05fb05015fc436b7100b7b9b5ea738e247a86008b16b8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:08.520085 containerd[1737]: time="2025-03-25T01:36:08.520026288Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:08.521018 containerd[1737]: time="2025-03-25T01:36:08.520875697Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.3\" with image id \"sha256:b4260bf5078ab1b01dd05fb05015fc436b7100b7b9b5ea738e247a86008b16b8\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\", size \"20659045\" in 1.357985242s" Mar 25 01:36:08.521018 containerd[1737]: time="2025-03-25T01:36:08.520914297Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\" returns image reference \"sha256:b4260bf5078ab1b01dd05fb05015fc436b7100b7b9b5ea738e247a86008b16b8\"" Mar 25 01:36:08.521777 containerd[1737]: time="2025-03-25T01:36:08.521538304Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\"" Mar 25 01:36:09.661685 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2880723913.mount: Deactivated successfully. 
Mar 25 01:36:10.247792 containerd[1737]: time="2025-03-25T01:36:10.247728289Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:10.251308 containerd[1737]: time="2025-03-25T01:36:10.251144026Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.3: active requests=0, bytes read=30918193" Mar 25 01:36:10.254413 containerd[1737]: time="2025-03-25T01:36:10.254349460Z" level=info msg="ImageCreate event name:\"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:10.258839 containerd[1737]: time="2025-03-25T01:36:10.258756807Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:10.260318 containerd[1737]: time="2025-03-25T01:36:10.259887519Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.3\" with image id \"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\", repo tag \"registry.k8s.io/kube-proxy:v1.32.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\", size \"30917204\" in 1.738313015s" Mar 25 01:36:10.260318 containerd[1737]: time="2025-03-25T01:36:10.259935220Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\" returns image reference \"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\"" Mar 25 01:36:10.260690 containerd[1737]: time="2025-03-25T01:36:10.260663328Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Mar 25 01:36:10.811032 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1897166678.mount: Deactivated successfully. 
Mar 25 01:36:11.986124 containerd[1737]: time="2025-03-25T01:36:11.986066559Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:11.987987 containerd[1737]: time="2025-03-25T01:36:11.987911374Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" Mar 25 01:36:11.991336 containerd[1737]: time="2025-03-25T01:36:11.991286301Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:11.996651 containerd[1737]: time="2025-03-25T01:36:11.996612843Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:11.997897 containerd[1737]: time="2025-03-25T01:36:11.997568851Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.736831123s" Mar 25 01:36:11.997897 containerd[1737]: time="2025-03-25T01:36:11.997607351Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Mar 25 01:36:11.998434 containerd[1737]: time="2025-03-25T01:36:11.998399857Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Mar 25 01:36:12.531810 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3742609969.mount: Deactivated successfully. 
Mar 25 01:36:12.553032 containerd[1737]: time="2025-03-25T01:36:12.552933563Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:36:12.555109 containerd[1737]: time="2025-03-25T01:36:12.555032680Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Mar 25 01:36:12.558599 containerd[1737]: time="2025-03-25T01:36:12.558542608Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:36:12.562894 containerd[1737]: time="2025-03-25T01:36:12.562839442Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:36:12.563614 containerd[1737]: time="2025-03-25T01:36:12.563452747Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 564.947989ms" Mar 25 01:36:12.563614 containerd[1737]: time="2025-03-25T01:36:12.563502047Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Mar 25 01:36:12.564228 containerd[1737]: time="2025-03-25T01:36:12.564191853Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Mar 25 01:36:13.092355 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1757013171.mount: 
Deactivated successfully. Mar 25 01:36:15.915555 containerd[1737]: time="2025-03-25T01:36:15.915492680Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:15.918630 containerd[1737]: time="2025-03-25T01:36:15.918574604Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551328" Mar 25 01:36:15.918883 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 25 01:36:15.920708 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:36:15.923949 containerd[1737]: time="2025-03-25T01:36:15.923424543Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:15.928554 containerd[1737]: time="2025-03-25T01:36:15.928526784Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:15.929551 containerd[1737]: time="2025-03-25T01:36:15.929520491Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.365295938s" Mar 25 01:36:15.929633 containerd[1737]: time="2025-03-25T01:36:15.929558692Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Mar 25 01:36:16.054454 kernel: hv_balloon: Max. dynamic memory size: 8192 MB Mar 25 01:36:16.125037 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 25 01:36:16.129086 (kubelet)[2627]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:36:16.164723 kubelet[2627]: E0325 01:36:16.164675 2627 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:36:16.167122 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:36:16.167332 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:36:16.167761 systemd[1]: kubelet.service: Consumed 157ms CPU time, 103.8M memory peak. Mar 25 01:36:16.488706 update_engine[1709]: I20250325 01:36:16.488616 1709 update_attempter.cc:509] Updating boot flags... Mar 25 01:36:16.762506 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (2653) Mar 25 01:36:16.961916 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (2652) Mar 25 01:36:18.911558 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:36:18.911793 systemd[1]: kubelet.service: Consumed 157ms CPU time, 103.8M memory peak. Mar 25 01:36:18.914354 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:36:18.943171 systemd[1]: Reload requested from client PID 2768 ('systemctl') (unit session-9.scope)... Mar 25 01:36:18.943365 systemd[1]: Reloading... Mar 25 01:36:19.087566 zram_generator::config[2818]: No configuration found. Mar 25 01:36:19.219042 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Mar 25 01:36:19.334899 systemd[1]: Reloading finished in 390 ms. Mar 25 01:36:19.385881 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 25 01:36:19.385986 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 25 01:36:19.386272 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:36:19.386329 systemd[1]: kubelet.service: Consumed 123ms CPU time, 91.8M memory peak. Mar 25 01:36:19.389313 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:36:19.722458 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:36:19.731833 (kubelet)[2886]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:36:20.481006 kubelet[2886]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:36:20.481006 kubelet[2886]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 25 01:36:20.481006 kubelet[2886]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 25 01:36:20.481503 kubelet[2886]: I0325 01:36:20.481077 2886 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:36:20.834193 kubelet[2886]: I0325 01:36:20.834074 2886 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Mar 25 01:36:20.834193 kubelet[2886]: I0325 01:36:20.834104 2886 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:36:20.834791 kubelet[2886]: I0325 01:36:20.834471 2886 server.go:954] "Client rotation is on, will bootstrap in background" Mar 25 01:36:20.859148 kubelet[2886]: E0325 01:36:20.859070 2886 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.14:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.14:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:36:20.861271 kubelet[2886]: I0325 01:36:20.861142 2886 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:36:20.874767 kubelet[2886]: I0325 01:36:20.874741 2886 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 25 01:36:20.879296 kubelet[2886]: I0325 01:36:20.879214 2886 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:36:20.880492 kubelet[2886]: I0325 01:36:20.880436 2886 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:36:20.880697 kubelet[2886]: I0325 01:36:20.880503 2886 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284.0.0-a-0ecc1f6a74","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 25 01:36:20.880843 kubelet[2886]: I0325 01:36:20.880701 2886 topology_manager.go:138] "Creating topology manager 
with none policy" Mar 25 01:36:20.880843 kubelet[2886]: I0325 01:36:20.880714 2886 container_manager_linux.go:304] "Creating device plugin manager" Mar 25 01:36:20.880929 kubelet[2886]: I0325 01:36:20.880860 2886 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:36:20.883787 kubelet[2886]: I0325 01:36:20.883765 2886 kubelet.go:446] "Attempting to sync node with API server" Mar 25 01:36:20.883787 kubelet[2886]: I0325 01:36:20.883789 2886 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:36:20.883911 kubelet[2886]: I0325 01:36:20.883817 2886 kubelet.go:352] "Adding apiserver pod source" Mar 25 01:36:20.883911 kubelet[2886]: I0325 01:36:20.883831 2886 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:36:20.889323 kubelet[2886]: W0325 01:36:20.889001 2886 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.14:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.14:6443: connect: connection refused Mar 25 01:36:20.889323 kubelet[2886]: E0325 01:36:20.889065 2886 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.14:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.14:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:36:20.889323 kubelet[2886]: W0325 01:36:20.889147 2886 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.14:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284.0.0-a-0ecc1f6a74&limit=500&resourceVersion=0": dial tcp 10.200.8.14:6443: connect: connection refused Mar 25 01:36:20.889323 kubelet[2886]: E0325 01:36:20.889186 2886 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.8.14:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284.0.0-a-0ecc1f6a74&limit=500&resourceVersion=0\": dial tcp 10.200.8.14:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:36:20.889575 kubelet[2886]: I0325 01:36:20.889560 2886 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:36:20.890089 kubelet[2886]: I0325 01:36:20.890069 2886 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:36:20.890783 kubelet[2886]: W0325 01:36:20.890760 2886 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 25 01:36:20.893540 kubelet[2886]: I0325 01:36:20.893317 2886 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 25 01:36:20.893540 kubelet[2886]: I0325 01:36:20.893355 2886 server.go:1287] "Started kubelet" Mar 25 01:36:20.898909 kubelet[2886]: I0325 01:36:20.898788 2886 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:36:20.900165 kubelet[2886]: E0325 01:36:20.898503 2886 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.14:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.14:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284.0.0-a-0ecc1f6a74.182fe7ebd50bfe51 default 0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284.0.0-a-0ecc1f6a74,UID:ci-4284.0.0-a-0ecc1f6a74,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284.0.0-a-0ecc1f6a74,},FirstTimestamp:2025-03-25 01:36:20.893335121 +0000 UTC m=+1.158114524,LastTimestamp:2025-03-25 01:36:20.893335121 +0000 UTC m=+1.158114524,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284.0.0-a-0ecc1f6a74,}" Mar 25 01:36:20.902449 kubelet[2886]: I0325 01:36:20.902418 2886 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:36:20.904991 kubelet[2886]: I0325 01:36:20.903709 2886 server.go:490] "Adding debug handlers to kubelet server" Mar 25 01:36:20.904991 kubelet[2886]: I0325 01:36:20.904883 2886 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:36:20.906733 kubelet[2886]: I0325 01:36:20.906141 2886 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 25 01:36:20.906733 kubelet[2886]: E0325 01:36:20.906308 2886 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284.0.0-a-0ecc1f6a74\" not found" Mar 25 01:36:20.908281 kubelet[2886]: I0325 01:36:20.906987 2886 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:36:20.908281 kubelet[2886]: I0325 01:36:20.907224 2886 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 01:36:20.908281 kubelet[2886]: E0325 01:36:20.907620 2886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.14:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284.0.0-a-0ecc1f6a74?timeout=10s\": dial tcp 10.200.8.14:6443: connect: connection refused" interval="200ms" Mar 25 01:36:20.908281 kubelet[2886]: I0325 01:36:20.907996 2886 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:36:20.908281 kubelet[2886]: I0325 01:36:20.908084 2886 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:36:20.910217 
kubelet[2886]: I0325 01:36:20.910193 2886 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:36:20.910992 kubelet[2886]: I0325 01:36:20.910975 2886 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:36:20.911126 kubelet[2886]: I0325 01:36:20.911116 2886 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:36:20.915121 kubelet[2886]: E0325 01:36:20.915099 2886 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:36:20.915790 kubelet[2886]: W0325 01:36:20.915744 2886 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.14:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.14:6443: connect: connection refused Mar 25 01:36:20.915928 kubelet[2886]: E0325 01:36:20.915907 2886 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.8.14:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.14:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:36:20.940159 kubelet[2886]: I0325 01:36:20.940125 2886 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:36:20.942666 kubelet[2886]: I0325 01:36:20.942643 2886 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:36:20.942800 kubelet[2886]: I0325 01:36:20.942788 2886 status_manager.go:227] "Starting to sync pod status with apiserver" Mar 25 01:36:20.942922 kubelet[2886]: I0325 01:36:20.942906 2886 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 25 01:36:20.943010 kubelet[2886]: I0325 01:36:20.943000 2886 kubelet.go:2388] "Starting kubelet main sync loop" Mar 25 01:36:20.943117 kubelet[2886]: I0325 01:36:20.943100 2886 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 25 01:36:20.943117 kubelet[2886]: I0325 01:36:20.943117 2886 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 25 01:36:20.943213 kubelet[2886]: I0325 01:36:20.943134 2886 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:36:20.943298 kubelet[2886]: E0325 01:36:20.943280 2886 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:36:20.946790 kubelet[2886]: W0325 01:36:20.946759 2886 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.14:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.14:6443: connect: connection refused Mar 25 01:36:20.946883 kubelet[2886]: E0325 01:36:20.946805 2886 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.14:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.14:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:36:20.947463 kubelet[2886]: I0325 01:36:20.947388 2886 policy_none.go:49] "None policy: Start" Mar 25 01:36:20.947463 kubelet[2886]: I0325 01:36:20.947415 2886 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 25 01:36:20.947463 kubelet[2886]: I0325 01:36:20.947429 2886 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:36:20.955670 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 25 01:36:20.967399 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Mar 25 01:36:20.970663 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 25 01:36:20.976289 kubelet[2886]: I0325 01:36:20.976265 2886 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:36:20.976544 kubelet[2886]: I0325 01:36:20.976502 2886 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 25 01:36:20.976614 kubelet[2886]: I0325 01:36:20.976539 2886 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:36:20.977223 kubelet[2886]: I0325 01:36:20.976950 2886 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:36:20.978355 kubelet[2886]: E0325 01:36:20.978334 2886 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 25 01:36:20.978821 kubelet[2886]: E0325 01:36:20.978799 2886 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284.0.0-a-0ecc1f6a74\" not found" Mar 25 01:36:21.055318 systemd[1]: Created slice kubepods-burstable-podcdc5291e07aeb5e8a291b59ed4f37673.slice - libcontainer container kubepods-burstable-podcdc5291e07aeb5e8a291b59ed4f37673.slice. Mar 25 01:36:21.067128 kubelet[2886]: E0325 01:36:21.066864 2886 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-0ecc1f6a74\" not found" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.070821 systemd[1]: Created slice kubepods-burstable-podce845712960e72bb84618f529e78bc12.slice - libcontainer container kubepods-burstable-podce845712960e72bb84618f529e78bc12.slice. 
Mar 25 01:36:21.072964 kubelet[2886]: E0325 01:36:21.072937 2886 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-0ecc1f6a74\" not found" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.075045 systemd[1]: Created slice kubepods-burstable-pod190811274414b5d256bda7965459ec5c.slice - libcontainer container kubepods-burstable-pod190811274414b5d256bda7965459ec5c.slice. Mar 25 01:36:21.076962 kubelet[2886]: E0325 01:36:21.076753 2886 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-0ecc1f6a74\" not found" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.078020 kubelet[2886]: I0325 01:36:21.078002 2886 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.078359 kubelet[2886]: E0325 01:36:21.078334 2886 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.200.8.14:6443/api/v1/nodes\": dial tcp 10.200.8.14:6443: connect: connection refused" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.109112 kubelet[2886]: E0325 01:36:21.108985 2886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.14:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284.0.0-a-0ecc1f6a74?timeout=10s\": dial tcp 10.200.8.14:6443: connect: connection refused" interval="400ms" Mar 25 01:36:21.112409 kubelet[2886]: I0325 01:36:21.112375 2886 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cdc5291e07aeb5e8a291b59ed4f37673-ca-certs\") pod \"kube-apiserver-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"cdc5291e07aeb5e8a291b59ed4f37673\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.112690 kubelet[2886]: I0325 01:36:21.112426 2886 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cdc5291e07aeb5e8a291b59ed4f37673-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"cdc5291e07aeb5e8a291b59ed4f37673\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.112690 kubelet[2886]: I0325 01:36:21.112458 2886 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ce845712960e72bb84618f529e78bc12-k8s-certs\") pod \"kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"ce845712960e72bb84618f529e78bc12\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.112690 kubelet[2886]: I0325 01:36:21.112501 2886 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ce845712960e72bb84618f529e78bc12-kubeconfig\") pod \"kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"ce845712960e72bb84618f529e78bc12\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.112690 kubelet[2886]: I0325 01:36:21.112569 2886 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cdc5291e07aeb5e8a291b59ed4f37673-k8s-certs\") pod \"kube-apiserver-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"cdc5291e07aeb5e8a291b59ed4f37673\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.112690 kubelet[2886]: I0325 01:36:21.112604 2886 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ce845712960e72bb84618f529e78bc12-ca-certs\") pod \"kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"ce845712960e72bb84618f529e78bc12\") " 
pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.112935 kubelet[2886]: I0325 01:36:21.112643 2886 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ce845712960e72bb84618f529e78bc12-flexvolume-dir\") pod \"kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"ce845712960e72bb84618f529e78bc12\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.112935 kubelet[2886]: I0325 01:36:21.112702 2886 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ce845712960e72bb84618f529e78bc12-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"ce845712960e72bb84618f529e78bc12\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.112935 kubelet[2886]: I0325 01:36:21.112779 2886 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/190811274414b5d256bda7965459ec5c-kubeconfig\") pod \"kube-scheduler-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"190811274414b5d256bda7965459ec5c\") " pod="kube-system/kube-scheduler-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.281153 kubelet[2886]: I0325 01:36:21.281102 2886 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.281592 kubelet[2886]: E0325 01:36:21.281549 2886 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.200.8.14:6443/api/v1/nodes\": dial tcp 10.200.8.14:6443: connect: connection refused" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.368656 containerd[1737]: time="2025-03-25T01:36:21.368505272Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4284.0.0-a-0ecc1f6a74,Uid:cdc5291e07aeb5e8a291b59ed4f37673,Namespace:kube-system,Attempt:0,}" Mar 25 01:36:21.374205 containerd[1737]: time="2025-03-25T01:36:21.373960846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74,Uid:ce845712960e72bb84618f529e78bc12,Namespace:kube-system,Attempt:0,}" Mar 25 01:36:21.378220 containerd[1737]: time="2025-03-25T01:36:21.378188403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284.0.0-a-0ecc1f6a74,Uid:190811274414b5d256bda7965459ec5c,Namespace:kube-system,Attempt:0,}" Mar 25 01:36:21.449230 containerd[1737]: time="2025-03-25T01:36:21.449151567Z" level=info msg="connecting to shim 90f54aa57ed6470ccb9fd0a7c96dc1bca1608b5e8c422bab0178998c430bc0aa" address="unix:///run/containerd/s/a042b18d24b97c0cfbd1a582d0f2b71cfbc029126d4aa51290e9737d9db2cea0" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:36:21.468749 containerd[1737]: time="2025-03-25T01:36:21.468683132Z" level=info msg="connecting to shim 55999b758a657a7c8e6bc3736ffe19a7fc31d03fa128c94013187fd2d159a9da" address="unix:///run/containerd/s/bfc1c192a88a2bdac9a66c698a3b3c28687d7d055190ca2559bb83ada5a4272c" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:36:21.493773 systemd[1]: Started cri-containerd-90f54aa57ed6470ccb9fd0a7c96dc1bca1608b5e8c422bab0178998c430bc0aa.scope - libcontainer container 90f54aa57ed6470ccb9fd0a7c96dc1bca1608b5e8c422bab0178998c430bc0aa. 
Mar 25 01:36:21.498559 containerd[1737]: time="2025-03-25T01:36:21.497982430Z" level=info msg="connecting to shim f229812c2ce6a0bfca749721da199169484ef76b8e1502eb8b5169828ecc52ad" address="unix:///run/containerd/s/8b71f2b98e981a0af36a1db78b7ddda8432f48612b180d91d14faa2209acc6f9" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:36:21.512158 kubelet[2886]: E0325 01:36:21.510138 2886 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.14:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284.0.0-a-0ecc1f6a74?timeout=10s\": dial tcp 10.200.8.14:6443: connect: connection refused" interval="800ms" Mar 25 01:36:21.532681 systemd[1]: Started cri-containerd-55999b758a657a7c8e6bc3736ffe19a7fc31d03fa128c94013187fd2d159a9da.scope - libcontainer container 55999b758a657a7c8e6bc3736ffe19a7fc31d03fa128c94013187fd2d159a9da. Mar 25 01:36:21.536495 systemd[1]: Started cri-containerd-f229812c2ce6a0bfca749721da199169484ef76b8e1502eb8b5169828ecc52ad.scope - libcontainer container f229812c2ce6a0bfca749721da199169484ef76b8e1502eb8b5169828ecc52ad. 
Mar 25 01:36:21.600110 containerd[1737]: time="2025-03-25T01:36:21.600042715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74,Uid:ce845712960e72bb84618f529e78bc12,Namespace:kube-system,Attempt:0,} returns sandbox id \"55999b758a657a7c8e6bc3736ffe19a7fc31d03fa128c94013187fd2d159a9da\"" Mar 25 01:36:21.614055 containerd[1737]: time="2025-03-25T01:36:21.613990204Z" level=info msg="CreateContainer within sandbox \"55999b758a657a7c8e6bc3736ffe19a7fc31d03fa128c94013187fd2d159a9da\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 25 01:36:21.621000 containerd[1737]: time="2025-03-25T01:36:21.620699496Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284.0.0-a-0ecc1f6a74,Uid:cdc5291e07aeb5e8a291b59ed4f37673,Namespace:kube-system,Attempt:0,} returns sandbox id \"90f54aa57ed6470ccb9fd0a7c96dc1bca1608b5e8c422bab0178998c430bc0aa\"" Mar 25 01:36:21.626878 containerd[1737]: time="2025-03-25T01:36:21.626021668Z" level=info msg="CreateContainer within sandbox \"90f54aa57ed6470ccb9fd0a7c96dc1bca1608b5e8c422bab0178998c430bc0aa\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 25 01:36:21.640894 containerd[1737]: time="2025-03-25T01:36:21.640868869Z" level=info msg="Container 5dc079d774cc60b5058745fdf073cca0b1eaae026fc527d870dadc4f577201e2: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:36:21.643631 containerd[1737]: time="2025-03-25T01:36:21.643607207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284.0.0-a-0ecc1f6a74,Uid:190811274414b5d256bda7965459ec5c,Namespace:kube-system,Attempt:0,} returns sandbox id \"f229812c2ce6a0bfca749721da199169484ef76b8e1502eb8b5169828ecc52ad\"" Mar 25 01:36:21.645933 containerd[1737]: time="2025-03-25T01:36:21.645900438Z" level=info msg="CreateContainer within sandbox \"f229812c2ce6a0bfca749721da199169484ef76b8e1502eb8b5169828ecc52ad\" for container 
&ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 25 01:36:21.672664 containerd[1737]: time="2025-03-25T01:36:21.672372897Z" level=info msg="Container be47deefee391f6adef23cda5cde86ab96f8e215679afe3c33a5d1c00b120462: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:36:21.683703 kubelet[2886]: I0325 01:36:21.683672 2886 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.684060 kubelet[2886]: E0325 01:36:21.684030 2886 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.200.8.14:6443/api/v1/nodes\": dial tcp 10.200.8.14:6443: connect: connection refused" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.692340 containerd[1737]: time="2025-03-25T01:36:21.692294868Z" level=info msg="CreateContainer within sandbox \"55999b758a657a7c8e6bc3736ffe19a7fc31d03fa128c94013187fd2d159a9da\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"5dc079d774cc60b5058745fdf073cca0b1eaae026fc527d870dadc4f577201e2\"" Mar 25 01:36:21.692932 containerd[1737]: time="2025-03-25T01:36:21.692901676Z" level=info msg="StartContainer for \"5dc079d774cc60b5058745fdf073cca0b1eaae026fc527d870dadc4f577201e2\"" Mar 25 01:36:21.694118 containerd[1737]: time="2025-03-25T01:36:21.693898089Z" level=info msg="connecting to shim 5dc079d774cc60b5058745fdf073cca0b1eaae026fc527d870dadc4f577201e2" address="unix:///run/containerd/s/bfc1c192a88a2bdac9a66c698a3b3c28687d7d055190ca2559bb83ada5a4272c" protocol=ttrpc version=3 Mar 25 01:36:21.694821 containerd[1737]: time="2025-03-25T01:36:21.694795002Z" level=info msg="Container c44ee5eec2597fa0b6b687d8f9b4b469774ffa41679ca3d728f0e91ff6c8ad70: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:36:21.703607 containerd[1737]: time="2025-03-25T01:36:21.703548120Z" level=info msg="CreateContainer within sandbox \"90f54aa57ed6470ccb9fd0a7c96dc1bca1608b5e8c422bab0178998c430bc0aa\" for 
&ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"be47deefee391f6adef23cda5cde86ab96f8e215679afe3c33a5d1c00b120462\"" Mar 25 01:36:21.704470 containerd[1737]: time="2025-03-25T01:36:21.704446133Z" level=info msg="StartContainer for \"be47deefee391f6adef23cda5cde86ab96f8e215679afe3c33a5d1c00b120462\"" Mar 25 01:36:21.708158 containerd[1737]: time="2025-03-25T01:36:21.707597075Z" level=info msg="connecting to shim be47deefee391f6adef23cda5cde86ab96f8e215679afe3c33a5d1c00b120462" address="unix:///run/containerd/s/a042b18d24b97c0cfbd1a582d0f2b71cfbc029126d4aa51290e9737d9db2cea0" protocol=ttrpc version=3 Mar 25 01:36:21.710724 containerd[1737]: time="2025-03-25T01:36:21.710693417Z" level=info msg="CreateContainer within sandbox \"f229812c2ce6a0bfca749721da199169484ef76b8e1502eb8b5169828ecc52ad\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c44ee5eec2597fa0b6b687d8f9b4b469774ffa41679ca3d728f0e91ff6c8ad70\"" Mar 25 01:36:21.711364 containerd[1737]: time="2025-03-25T01:36:21.711340626Z" level=info msg="StartContainer for \"c44ee5eec2597fa0b6b687d8f9b4b469774ffa41679ca3d728f0e91ff6c8ad70\"" Mar 25 01:36:21.714232 containerd[1737]: time="2025-03-25T01:36:21.714202165Z" level=info msg="connecting to shim c44ee5eec2597fa0b6b687d8f9b4b469774ffa41679ca3d728f0e91ff6c8ad70" address="unix:///run/containerd/s/8b71f2b98e981a0af36a1db78b7ddda8432f48612b180d91d14faa2209acc6f9" protocol=ttrpc version=3 Mar 25 01:36:21.717672 systemd[1]: Started cri-containerd-5dc079d774cc60b5058745fdf073cca0b1eaae026fc527d870dadc4f577201e2.scope - libcontainer container 5dc079d774cc60b5058745fdf073cca0b1eaae026fc527d870dadc4f577201e2. Mar 25 01:36:21.748649 systemd[1]: Started cri-containerd-be47deefee391f6adef23cda5cde86ab96f8e215679afe3c33a5d1c00b120462.scope - libcontainer container be47deefee391f6adef23cda5cde86ab96f8e215679afe3c33a5d1c00b120462. 
Mar 25 01:36:21.750860 systemd[1]: Started cri-containerd-c44ee5eec2597fa0b6b687d8f9b4b469774ffa41679ca3d728f0e91ff6c8ad70.scope - libcontainer container c44ee5eec2597fa0b6b687d8f9b4b469774ffa41679ca3d728f0e91ff6c8ad70. Mar 25 01:36:21.810766 containerd[1737]: time="2025-03-25T01:36:21.810622674Z" level=info msg="StartContainer for \"5dc079d774cc60b5058745fdf073cca0b1eaae026fc527d870dadc4f577201e2\" returns successfully" Mar 25 01:36:21.857666 containerd[1737]: time="2025-03-25T01:36:21.857619412Z" level=info msg="StartContainer for \"be47deefee391f6adef23cda5cde86ab96f8e215679afe3c33a5d1c00b120462\" returns successfully" Mar 25 01:36:21.885524 containerd[1737]: time="2025-03-25T01:36:21.885381289Z" level=info msg="StartContainer for \"c44ee5eec2597fa0b6b687d8f9b4b469774ffa41679ca3d728f0e91ff6c8ad70\" returns successfully" Mar 25 01:36:21.956140 kubelet[2886]: E0325 01:36:21.956106 2886 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-0ecc1f6a74\" not found" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.958419 kubelet[2886]: E0325 01:36:21.958382 2886 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-0ecc1f6a74\" not found" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:21.963541 kubelet[2886]: E0325 01:36:21.963516 2886 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-0ecc1f6a74\" not found" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:22.486466 kubelet[2886]: I0325 01:36:22.486429 2886 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:22.967465 kubelet[2886]: E0325 01:36:22.967417 2886 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-0ecc1f6a74\" not found" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:22.967917 
kubelet[2886]: E0325 01:36:22.967902 2886 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284.0.0-a-0ecc1f6a74\" not found" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:24.991417 kubelet[2886]: I0325 01:36:24.991374 2886 kubelet_node_status.go:79] "Successfully registered node" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:24.991417 kubelet[2886]: E0325 01:36:24.991412 2886 kubelet_node_status.go:549] "Error updating node status, will retry" err="error getting node \"ci-4284.0.0-a-0ecc1f6a74\": node \"ci-4284.0.0-a-0ecc1f6a74\" not found" Mar 25 01:36:25.006932 kubelet[2886]: I0325 01:36:25.006829 2886 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:25.018430 kubelet[2886]: W0325 01:36:25.017403 2886 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 25 01:36:25.018430 kubelet[2886]: I0325 01:36:25.017555 2886 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:25.031032 kubelet[2886]: W0325 01:36:25.030995 2886 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 25 01:36:25.031198 kubelet[2886]: I0325 01:36:25.031107 2886 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:25.039456 kubelet[2886]: W0325 01:36:25.039425 2886 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 25 01:36:25.988171 kubelet[2886]: I0325 01:36:25.987840 2886 apiserver.go:52] "Watching apiserver" Mar 25 01:36:26.011819 kubelet[2886]: I0325 01:36:26.011780 2886 
desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:36:27.412434 systemd[1]: Reload requested from client PID 3154 ('systemctl') (unit session-9.scope)... Mar 25 01:36:27.412451 systemd[1]: Reloading... Mar 25 01:36:27.529514 zram_generator::config[3207]: No configuration found. Mar 25 01:36:27.650655 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:36:27.787905 systemd[1]: Reloading finished in 374 ms. Mar 25 01:36:27.819879 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:36:27.820520 kubelet[2886]: I0325 01:36:27.819877 2886 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:36:27.840012 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:36:27.840287 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:36:27.840346 systemd[1]: kubelet.service: Consumed 904ms CPU time, 127.8M memory peak. Mar 25 01:36:27.844922 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:36:27.989160 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:36:27.996880 (kubelet)[3269]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:36:28.046186 kubelet[3269]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:36:28.046186 kubelet[3269]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Mar 25 01:36:28.046186 kubelet[3269]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:36:28.047151 kubelet[3269]: I0325 01:36:28.046765 3269 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:36:28.053659 kubelet[3269]: I0325 01:36:28.053635 3269 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Mar 25 01:36:28.053659 kubelet[3269]: I0325 01:36:28.053658 3269 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:36:28.053930 kubelet[3269]: I0325 01:36:28.053908 3269 server.go:954] "Client rotation is on, will bootstrap in background" Mar 25 01:36:28.055068 kubelet[3269]: I0325 01:36:28.055045 3269 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 25 01:36:28.057181 kubelet[3269]: I0325 01:36:28.057155 3269 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:36:28.061101 kubelet[3269]: I0325 01:36:28.061081 3269 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 25 01:36:28.066382 kubelet[3269]: I0325 01:36:28.066316 3269 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:36:28.066962 kubelet[3269]: I0325 01:36:28.066915 3269 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:36:28.067225 kubelet[3269]: I0325 01:36:28.066965 3269 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284.0.0-a-0ecc1f6a74","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 25 01:36:28.067225 kubelet[3269]: I0325 01:36:28.067170 3269 topology_manager.go:138] "Creating topology manager 
with none policy" Mar 25 01:36:28.067225 kubelet[3269]: I0325 01:36:28.067184 3269 container_manager_linux.go:304] "Creating device plugin manager" Mar 25 01:36:28.067428 kubelet[3269]: I0325 01:36:28.067230 3269 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:36:28.067572 kubelet[3269]: I0325 01:36:28.067556 3269 kubelet.go:446] "Attempting to sync node with API server" Mar 25 01:36:28.069457 kubelet[3269]: I0325 01:36:28.067575 3269 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:36:28.069457 kubelet[3269]: I0325 01:36:28.067600 3269 kubelet.go:352] "Adding apiserver pod source" Mar 25 01:36:28.069457 kubelet[3269]: I0325 01:36:28.067613 3269 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:36:28.073499 kubelet[3269]: I0325 01:36:28.071830 3269 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:36:28.073499 kubelet[3269]: I0325 01:36:28.072288 3269 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:36:28.073499 kubelet[3269]: I0325 01:36:28.072759 3269 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 25 01:36:28.073499 kubelet[3269]: I0325 01:36:28.072790 3269 server.go:1287] "Started kubelet" Mar 25 01:36:28.076814 kubelet[3269]: I0325 01:36:28.076798 3269 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:36:28.083321 kubelet[3269]: I0325 01:36:28.083292 3269 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:36:28.084987 kubelet[3269]: I0325 01:36:28.084971 3269 server.go:490] "Adding debug handlers to kubelet server" Mar 25 01:36:28.088502 kubelet[3269]: I0325 01:36:28.087039 3269 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:36:28.088804 kubelet[3269]: I0325 01:36:28.088787 3269 server.go:243] "Starting to serve the 
podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:36:28.088921 kubelet[3269]: I0325 01:36:28.088835 3269 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 01:36:28.091498 kubelet[3269]: I0325 01:36:28.089071 3269 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 25 01:36:28.091850 kubelet[3269]: E0325 01:36:28.091830 3269 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284.0.0-a-0ecc1f6a74\" not found" Mar 25 01:36:28.095656 kubelet[3269]: I0325 01:36:28.095634 3269 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:36:28.095907 kubelet[3269]: I0325 01:36:28.095893 3269 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:36:28.101277 kubelet[3269]: I0325 01:36:28.101253 3269 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:36:28.102895 kubelet[3269]: I0325 01:36:28.102875 3269 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:36:28.103015 kubelet[3269]: I0325 01:36:28.103005 3269 status_manager.go:227] "Starting to sync pod status with apiserver" Mar 25 01:36:28.103089 kubelet[3269]: I0325 01:36:28.103080 3269 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 25 01:36:28.103141 kubelet[3269]: I0325 01:36:28.103135 3269 kubelet.go:2388] "Starting kubelet main sync loop" Mar 25 01:36:28.103245 kubelet[3269]: E0325 01:36:28.103230 3269 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:36:28.112147 kubelet[3269]: I0325 01:36:28.112125 3269 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:36:28.112541 kubelet[3269]: I0325 01:36:28.112517 3269 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:36:28.118463 kubelet[3269]: E0325 01:36:28.116516 3269 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:36:28.118592 kubelet[3269]: I0325 01:36:28.117192 3269 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:36:28.160722 kubelet[3269]: I0325 01:36:28.160692 3269 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 25 01:36:28.160722 kubelet[3269]: I0325 01:36:28.160709 3269 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 25 01:36:28.160722 kubelet[3269]: I0325 01:36:28.160732 3269 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:36:28.161043 kubelet[3269]: I0325 01:36:28.160961 3269 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 25 01:36:28.161043 kubelet[3269]: I0325 01:36:28.160980 3269 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 25 01:36:28.161043 kubelet[3269]: I0325 01:36:28.161015 3269 policy_none.go:49] "None policy: Start" Mar 25 01:36:28.161043 kubelet[3269]: I0325 01:36:28.161046 3269 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 25 01:36:28.161303 kubelet[3269]: I0325 01:36:28.161062 
3269 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:36:28.161303 kubelet[3269]: I0325 01:36:28.161201 3269 state_mem.go:75] "Updated machine memory state" Mar 25 01:36:28.165760 kubelet[3269]: I0325 01:36:28.165730 3269 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:36:28.166511 kubelet[3269]: I0325 01:36:28.165895 3269 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 25 01:36:28.166511 kubelet[3269]: I0325 01:36:28.165909 3269 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:36:28.166511 kubelet[3269]: I0325 01:36:28.166295 3269 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:36:28.167958 kubelet[3269]: E0325 01:36:28.167759 3269 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 25 01:36:28.204153 kubelet[3269]: I0325 01:36:28.204120 3269 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.204333 kubelet[3269]: I0325 01:36:28.204314 3269 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.204615 kubelet[3269]: I0325 01:36:28.204179 3269 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.217536 kubelet[3269]: W0325 01:36:28.217509 3269 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 25 01:36:28.217536 kubelet[3269]: E0325 01:36:28.217581 3269 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284.0.0-a-0ecc1f6a74\" already exists" 
pod="kube-system/kube-scheduler-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.218155 kubelet[3269]: W0325 01:36:28.218137 3269 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 25 01:36:28.218294 kubelet[3269]: E0325 01:36:28.218209 3269 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284.0.0-a-0ecc1f6a74\" already exists" pod="kube-system/kube-apiserver-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.218294 kubelet[3269]: W0325 01:36:28.218246 3269 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 25 01:36:28.218294 kubelet[3269]: E0325 01:36:28.218276 3269 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74\" already exists" pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.269673 kubelet[3269]: I0325 01:36:28.269631 3269 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.279060 kubelet[3269]: I0325 01:36:28.279027 3269 kubelet_node_status.go:125] "Node was previously registered" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.279208 kubelet[3269]: I0325 01:36:28.279099 3269 kubelet_node_status.go:79] "Successfully registered node" node="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.397364 kubelet[3269]: I0325 01:36:28.397222 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ce845712960e72bb84618f529e78bc12-kubeconfig\") pod \"kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"ce845712960e72bb84618f529e78bc12\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.397364 kubelet[3269]: I0325 01:36:28.397260 3269 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/190811274414b5d256bda7965459ec5c-kubeconfig\") pod \"kube-scheduler-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"190811274414b5d256bda7965459ec5c\") " pod="kube-system/kube-scheduler-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.397364 kubelet[3269]: I0325 01:36:28.397284 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cdc5291e07aeb5e8a291b59ed4f37673-ca-certs\") pod \"kube-apiserver-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"cdc5291e07aeb5e8a291b59ed4f37673\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.397364 kubelet[3269]: I0325 01:36:28.397307 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cdc5291e07aeb5e8a291b59ed4f37673-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"cdc5291e07aeb5e8a291b59ed4f37673\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.397364 kubelet[3269]: I0325 01:36:28.397337 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ce845712960e72bb84618f529e78bc12-ca-certs\") pod \"kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"ce845712960e72bb84618f529e78bc12\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.397820 kubelet[3269]: I0325 01:36:28.397359 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ce845712960e72bb84618f529e78bc12-flexvolume-dir\") pod \"kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"ce845712960e72bb84618f529e78bc12\") " 
pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.397820 kubelet[3269]: I0325 01:36:28.397392 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cdc5291e07aeb5e8a291b59ed4f37673-k8s-certs\") pod \"kube-apiserver-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"cdc5291e07aeb5e8a291b59ed4f37673\") " pod="kube-system/kube-apiserver-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.397820 kubelet[3269]: I0325 01:36:28.397425 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ce845712960e72bb84618f529e78bc12-k8s-certs\") pod \"kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"ce845712960e72bb84618f529e78bc12\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:28.397820 kubelet[3269]: I0325 01:36:28.397448 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ce845712960e72bb84618f529e78bc12-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74\" (UID: \"ce845712960e72bb84618f529e78bc12\") " pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:36:29.068531 kubelet[3269]: I0325 01:36:29.068470 3269 apiserver.go:52] "Watching apiserver" Mar 25 01:36:29.096563 kubelet[3269]: I0325 01:36:29.096463 3269 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:36:29.165579 kubelet[3269]: I0325 01:36:29.165439 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284.0.0-a-0ecc1f6a74" podStartSLOduration=4.165398527 podStartE2EDuration="4.165398527s" podCreationTimestamp="2025-03-25 01:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:36:29.165173824 +0000 UTC m=+1.163591275" watchObservedRunningTime="2025-03-25 01:36:29.165398527 +0000 UTC m=+1.163815978" Mar 25 01:36:29.185350 kubelet[3269]: I0325 01:36:29.185273 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284.0.0-a-0ecc1f6a74" podStartSLOduration=4.1852528 podStartE2EDuration="4.1852528s" podCreationTimestamp="2025-03-25 01:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:36:29.174867857 +0000 UTC m=+1.173285308" watchObservedRunningTime="2025-03-25 01:36:29.1852528 +0000 UTC m=+1.183670251" Mar 25 01:36:29.200960 kubelet[3269]: I0325 01:36:29.200900 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284.0.0-a-0ecc1f6a74" podStartSLOduration=4.200881014 podStartE2EDuration="4.200881014s" podCreationTimestamp="2025-03-25 01:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:36:29.185932409 +0000 UTC m=+1.184349960" watchObservedRunningTime="2025-03-25 01:36:29.200881014 +0000 UTC m=+1.199298565" Mar 25 01:36:32.447030 kubelet[3269]: I0325 01:36:32.446992 3269 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 25 01:36:32.448124 kubelet[3269]: I0325 01:36:32.447650 3269 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 25 01:36:32.448234 containerd[1737]: time="2025-03-25T01:36:32.447385384Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Mar 25 01:36:33.416104 systemd[1]: Created slice kubepods-besteffort-pod74f081b4_acdb_4ac5_a432_b67fc970ff70.slice - libcontainer container kubepods-besteffort-pod74f081b4_acdb_4ac5_a432_b67fc970ff70.slice. Mar 25 01:36:33.429650 kubelet[3269]: I0325 01:36:33.429574 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zt9g\" (UniqueName: \"kubernetes.io/projected/74f081b4-acdb-4ac5-a432-b67fc970ff70-kube-api-access-2zt9g\") pod \"kube-proxy-cb2wm\" (UID: \"74f081b4-acdb-4ac5-a432-b67fc970ff70\") " pod="kube-system/kube-proxy-cb2wm" Mar 25 01:36:33.429650 kubelet[3269]: I0325 01:36:33.429632 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/74f081b4-acdb-4ac5-a432-b67fc970ff70-kube-proxy\") pod \"kube-proxy-cb2wm\" (UID: \"74f081b4-acdb-4ac5-a432-b67fc970ff70\") " pod="kube-system/kube-proxy-cb2wm" Mar 25 01:36:33.429823 kubelet[3269]: I0325 01:36:33.429658 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/74f081b4-acdb-4ac5-a432-b67fc970ff70-xtables-lock\") pod \"kube-proxy-cb2wm\" (UID: \"74f081b4-acdb-4ac5-a432-b67fc970ff70\") " pod="kube-system/kube-proxy-cb2wm" Mar 25 01:36:33.429823 kubelet[3269]: I0325 01:36:33.429678 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/74f081b4-acdb-4ac5-a432-b67fc970ff70-lib-modules\") pod \"kube-proxy-cb2wm\" (UID: \"74f081b4-acdb-4ac5-a432-b67fc970ff70\") " pod="kube-system/kube-proxy-cb2wm" Mar 25 01:36:33.544533 systemd[1]: Created slice kubepods-besteffort-podc7415fb1_52d5_4d8b_b562_773043b6a0d4.slice - libcontainer container kubepods-besteffort-podc7415fb1_52d5_4d8b_b562_773043b6a0d4.slice. 
Mar 25 01:36:33.631974 kubelet[3269]: I0325 01:36:33.631920 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfjrw\" (UniqueName: \"kubernetes.io/projected/c7415fb1-52d5-4d8b-b562-773043b6a0d4-kube-api-access-gfjrw\") pod \"tigera-operator-ccfc44587-gtwv6\" (UID: \"c7415fb1-52d5-4d8b-b562-773043b6a0d4\") " pod="tigera-operator/tigera-operator-ccfc44587-gtwv6" Mar 25 01:36:33.632725 kubelet[3269]: I0325 01:36:33.632615 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c7415fb1-52d5-4d8b-b562-773043b6a0d4-var-lib-calico\") pod \"tigera-operator-ccfc44587-gtwv6\" (UID: \"c7415fb1-52d5-4d8b-b562-773043b6a0d4\") " pod="tigera-operator/tigera-operator-ccfc44587-gtwv6" Mar 25 01:36:33.724295 containerd[1737]: time="2025-03-25T01:36:33.724026618Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cb2wm,Uid:74f081b4-acdb-4ac5-a432-b67fc970ff70,Namespace:kube-system,Attempt:0,}" Mar 25 01:36:33.777541 containerd[1737]: time="2025-03-25T01:36:33.776628499Z" level=info msg="connecting to shim 1382abafb7ecf23368d5f8fe6385121bfc93258ccf254be401e9e4b982acd043" address="unix:///run/containerd/s/528600f94a5ddc810b160948994f7675c8c885ecf383f73c14b7a12fedd01f40" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:36:33.812669 systemd[1]: Started cri-containerd-1382abafb7ecf23368d5f8fe6385121bfc93258ccf254be401e9e4b982acd043.scope - libcontainer container 1382abafb7ecf23368d5f8fe6385121bfc93258ccf254be401e9e4b982acd043. 
Mar 25 01:36:33.850015 containerd[1737]: time="2025-03-25T01:36:33.849967908Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-gtwv6,Uid:c7415fb1-52d5-4d8b-b562-773043b6a0d4,Namespace:tigera-operator,Attempt:0,}" Mar 25 01:36:33.860223 sudo[2204]: pam_unix(sudo:session): session closed for user root Mar 25 01:36:33.869692 containerd[1737]: time="2025-03-25T01:36:33.869652125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cb2wm,Uid:74f081b4-acdb-4ac5-a432-b67fc970ff70,Namespace:kube-system,Attempt:0,} returns sandbox id \"1382abafb7ecf23368d5f8fe6385121bfc93258ccf254be401e9e4b982acd043\"" Mar 25 01:36:33.872941 containerd[1737]: time="2025-03-25T01:36:33.872903561Z" level=info msg="CreateContainer within sandbox \"1382abafb7ecf23368d5f8fe6385121bfc93258ccf254be401e9e4b982acd043\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 25 01:36:33.910365 containerd[1737]: time="2025-03-25T01:36:33.908769157Z" level=info msg="Container 2b449ffe94db55ab113204ee38d9ad764bbb820369e721dd1f54027bfe62d7a6: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:36:33.924367 containerd[1737]: time="2025-03-25T01:36:33.924325129Z" level=info msg="connecting to shim 140eff08a34e011d9654befaafc37aa37c8713b7bf4184d1606e37086ea5f27d" address="unix:///run/containerd/s/f0183895b8f3daa64e33f6cbe874d3c35732912e5257834ef485fe7c3950a220" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:36:33.927680 containerd[1737]: time="2025-03-25T01:36:33.927612265Z" level=info msg="CreateContainer within sandbox \"1382abafb7ecf23368d5f8fe6385121bfc93258ccf254be401e9e4b982acd043\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"2b449ffe94db55ab113204ee38d9ad764bbb820369e721dd1f54027bfe62d7a6\"" Mar 25 01:36:33.929714 containerd[1737]: time="2025-03-25T01:36:33.929461486Z" level=info msg="StartContainer for \"2b449ffe94db55ab113204ee38d9ad764bbb820369e721dd1f54027bfe62d7a6\"" Mar 25 01:36:33.932486 
containerd[1737]: time="2025-03-25T01:36:33.932435418Z" level=info msg="connecting to shim 2b449ffe94db55ab113204ee38d9ad764bbb820369e721dd1f54027bfe62d7a6" address="unix:///run/containerd/s/528600f94a5ddc810b160948994f7675c8c885ecf383f73c14b7a12fedd01f40" protocol=ttrpc version=3 Mar 25 01:36:33.951637 systemd[1]: Started cri-containerd-140eff08a34e011d9654befaafc37aa37c8713b7bf4184d1606e37086ea5f27d.scope - libcontainer container 140eff08a34e011d9654befaafc37aa37c8713b7bf4184d1606e37086ea5f27d. Mar 25 01:36:33.956911 systemd[1]: Started cri-containerd-2b449ffe94db55ab113204ee38d9ad764bbb820369e721dd1f54027bfe62d7a6.scope - libcontainer container 2b449ffe94db55ab113204ee38d9ad764bbb820369e721dd1f54027bfe62d7a6. Mar 25 01:36:33.972883 sshd[2203]: Connection closed by 10.200.16.10 port 47184 Mar 25 01:36:33.973474 sshd-session[2201]: pam_unix(sshd:session): session closed for user core Mar 25 01:36:33.979023 systemd-logind[1706]: Session 9 logged out. Waiting for processes to exit. Mar 25 01:36:33.980319 systemd[1]: sshd@6-10.200.8.14:22-10.200.16.10:47184.service: Deactivated successfully. Mar 25 01:36:33.985173 systemd[1]: session-9.scope: Deactivated successfully. Mar 25 01:36:33.985724 systemd[1]: session-9.scope: Consumed 4.337s CPU time, 228.1M memory peak. Mar 25 01:36:33.991196 systemd-logind[1706]: Removed session 9. 
Mar 25 01:36:34.037873 containerd[1737]: time="2025-03-25T01:36:34.037827382Z" level=info msg="StartContainer for \"2b449ffe94db55ab113204ee38d9ad764bbb820369e721dd1f54027bfe62d7a6\" returns successfully" Mar 25 01:36:34.042026 containerd[1737]: time="2025-03-25T01:36:34.041957127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-gtwv6,Uid:c7415fb1-52d5-4d8b-b562-773043b6a0d4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"140eff08a34e011d9654befaafc37aa37c8713b7bf4184d1606e37086ea5f27d\"" Mar 25 01:36:34.044097 containerd[1737]: time="2025-03-25T01:36:34.044038650Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 25 01:36:34.177020 kubelet[3269]: I0325 01:36:34.176938 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-cb2wm" podStartSLOduration=1.176916517 podStartE2EDuration="1.176916517s" podCreationTimestamp="2025-03-25 01:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:36:34.176332911 +0000 UTC m=+6.174750462" watchObservedRunningTime="2025-03-25 01:36:34.176916517 +0000 UTC m=+6.175333968" Mar 25 01:36:36.007862 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount930777808.mount: Deactivated successfully. 
Mar 25 01:36:36.902220 containerd[1737]: time="2025-03-25T01:36:36.902160622Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:36.904837 containerd[1737]: time="2025-03-25T01:36:36.904685150Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008" Mar 25 01:36:36.908276 containerd[1737]: time="2025-03-25T01:36:36.908134888Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:36.912951 containerd[1737]: time="2025-03-25T01:36:36.912898641Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:36.913536 containerd[1737]: time="2025-03-25T01:36:36.913499348Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 2.869404496s" Mar 25 01:36:36.913614 containerd[1737]: time="2025-03-25T01:36:36.913537748Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\"" Mar 25 01:36:36.916608 containerd[1737]: time="2025-03-25T01:36:36.915891974Z" level=info msg="CreateContainer within sandbox \"140eff08a34e011d9654befaafc37aa37c8713b7bf4184d1606e37086ea5f27d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 25 01:36:36.937745 containerd[1737]: time="2025-03-25T01:36:36.937714615Z" level=info msg="Container 
504d6e5c4974e7f4c5cde4530381937bf85962a13a6edce53a7fd1a83b89a671: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:36:36.943666 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount437871050.mount: Deactivated successfully. Mar 25 01:36:36.952207 containerd[1737]: time="2025-03-25T01:36:36.952171275Z" level=info msg="CreateContainer within sandbox \"140eff08a34e011d9654befaafc37aa37c8713b7bf4184d1606e37086ea5f27d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"504d6e5c4974e7f4c5cde4530381937bf85962a13a6edce53a7fd1a83b89a671\"" Mar 25 01:36:36.957509 containerd[1737]: time="2025-03-25T01:36:36.956685325Z" level=info msg="StartContainer for \"504d6e5c4974e7f4c5cde4530381937bf85962a13a6edce53a7fd1a83b89a671\"" Mar 25 01:36:36.957765 containerd[1737]: time="2025-03-25T01:36:36.957733637Z" level=info msg="connecting to shim 504d6e5c4974e7f4c5cde4530381937bf85962a13a6edce53a7fd1a83b89a671" address="unix:///run/containerd/s/f0183895b8f3daa64e33f6cbe874d3c35732912e5257834ef485fe7c3950a220" protocol=ttrpc version=3 Mar 25 01:36:36.983710 systemd[1]: Started cri-containerd-504d6e5c4974e7f4c5cde4530381937bf85962a13a6edce53a7fd1a83b89a671.scope - libcontainer container 504d6e5c4974e7f4c5cde4530381937bf85962a13a6edce53a7fd1a83b89a671. 
Mar 25 01:36:37.013206 containerd[1737]: time="2025-03-25T01:36:37.013164949Z" level=info msg="StartContainer for \"504d6e5c4974e7f4c5cde4530381937bf85962a13a6edce53a7fd1a83b89a671\" returns successfully" Mar 25 01:36:37.192832 kubelet[3269]: I0325 01:36:37.192760 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-ccfc44587-gtwv6" podStartSLOduration=1.321696918 podStartE2EDuration="4.192738533s" podCreationTimestamp="2025-03-25 01:36:33 +0000 UTC" firstStartedPulling="2025-03-25 01:36:34.043405943 +0000 UTC m=+6.041823494" lastFinishedPulling="2025-03-25 01:36:36.914447558 +0000 UTC m=+8.912865109" observedRunningTime="2025-03-25 01:36:37.192396229 +0000 UTC m=+9.190813780" watchObservedRunningTime="2025-03-25 01:36:37.192738533 +0000 UTC m=+9.191155984" Mar 25 01:36:40.151460 systemd[1]: Created slice kubepods-besteffort-pod1b06ac7e_e600_4623_bdd8_0a72a5d7bc6a.slice - libcontainer container kubepods-besteffort-pod1b06ac7e_e600_4623_bdd8_0a72a5d7bc6a.slice. 
Mar 25 01:36:40.173577 kubelet[3269]: I0325 01:36:40.173524 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b06ac7e-e600-4623-bdd8-0a72a5d7bc6a-tigera-ca-bundle\") pod \"calico-typha-546fcb4b79-hwnnz\" (UID: \"1b06ac7e-e600-4623-bdd8-0a72a5d7bc6a\") " pod="calico-system/calico-typha-546fcb4b79-hwnnz" Mar 25 01:36:40.173577 kubelet[3269]: I0325 01:36:40.173582 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmvtd\" (UniqueName: \"kubernetes.io/projected/1b06ac7e-e600-4623-bdd8-0a72a5d7bc6a-kube-api-access-vmvtd\") pod \"calico-typha-546fcb4b79-hwnnz\" (UID: \"1b06ac7e-e600-4623-bdd8-0a72a5d7bc6a\") " pod="calico-system/calico-typha-546fcb4b79-hwnnz" Mar 25 01:36:40.174138 kubelet[3269]: I0325 01:36:40.173606 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/1b06ac7e-e600-4623-bdd8-0a72a5d7bc6a-typha-certs\") pod \"calico-typha-546fcb4b79-hwnnz\" (UID: \"1b06ac7e-e600-4623-bdd8-0a72a5d7bc6a\") " pod="calico-system/calico-typha-546fcb4b79-hwnnz" Mar 25 01:36:40.256653 systemd[1]: Created slice kubepods-besteffort-podaccbb724_6515_4c25_bcfd_7ebc312cd5cc.slice - libcontainer container kubepods-besteffort-podaccbb724_6515_4c25_bcfd_7ebc312cd5cc.slice. 
Mar 25 01:36:40.274045 kubelet[3269]: I0325 01:36:40.274002 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/accbb724-6515-4c25-bcfd-7ebc312cd5cc-cni-net-dir\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.274218 kubelet[3269]: I0325 01:36:40.274089 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/accbb724-6515-4c25-bcfd-7ebc312cd5cc-flexvol-driver-host\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.274218 kubelet[3269]: I0325 01:36:40.274115 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/accbb724-6515-4c25-bcfd-7ebc312cd5cc-tigera-ca-bundle\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.274218 kubelet[3269]: I0325 01:36:40.274139 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/accbb724-6515-4c25-bcfd-7ebc312cd5cc-xtables-lock\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.274218 kubelet[3269]: I0325 01:36:40.274163 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/accbb724-6515-4c25-bcfd-7ebc312cd5cc-lib-modules\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.274218 kubelet[3269]: I0325 01:36:40.274184 3269 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6bb5\" (UniqueName: \"kubernetes.io/projected/accbb724-6515-4c25-bcfd-7ebc312cd5cc-kube-api-access-j6bb5\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.274417 kubelet[3269]: I0325 01:36:40.274217 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/accbb724-6515-4c25-bcfd-7ebc312cd5cc-node-certs\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.274417 kubelet[3269]: I0325 01:36:40.274243 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/accbb724-6515-4c25-bcfd-7ebc312cd5cc-var-run-calico\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.274417 kubelet[3269]: I0325 01:36:40.274281 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/accbb724-6515-4c25-bcfd-7ebc312cd5cc-var-lib-calico\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.274417 kubelet[3269]: I0325 01:36:40.274304 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/accbb724-6515-4c25-bcfd-7ebc312cd5cc-cni-bin-dir\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.274417 kubelet[3269]: I0325 01:36:40.274337 3269 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/accbb724-6515-4c25-bcfd-7ebc312cd5cc-policysync\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.274621 kubelet[3269]: I0325 01:36:40.274358 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/accbb724-6515-4c25-bcfd-7ebc312cd5cc-cni-log-dir\") pod \"calico-node-ltctp\" (UID: \"accbb724-6515-4c25-bcfd-7ebc312cd5cc\") " pod="calico-system/calico-node-ltctp" Mar 25 01:36:40.378325 kubelet[3269]: E0325 01:36:40.378269 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.378325 kubelet[3269]: W0325 01:36:40.378314 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.378755 kubelet[3269]: E0325 01:36:40.378347 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.378755 kubelet[3269]: E0325 01:36:40.378671 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.378755 kubelet[3269]: W0325 01:36:40.378686 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.378958 kubelet[3269]: E0325 01:36:40.378935 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.379252 kubelet[3269]: E0325 01:36:40.379231 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.379252 kubelet[3269]: W0325 01:36:40.379251 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.379387 kubelet[3269]: E0325 01:36:40.379271 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.379807 kubelet[3269]: E0325 01:36:40.379786 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.379807 kubelet[3269]: W0325 01:36:40.379805 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.380065 kubelet[3269]: E0325 01:36:40.379946 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.380810 kubelet[3269]: E0325 01:36:40.380790 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.380810 kubelet[3269]: W0325 01:36:40.380808 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.381635 kubelet[3269]: E0325 01:36:40.381043 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.381635 kubelet[3269]: W0325 01:36:40.381058 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.381635 kubelet[3269]: E0325 01:36:40.381174 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.381635 kubelet[3269]: E0325 01:36:40.381200 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.381969 kubelet[3269]: E0325 01:36:40.381674 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.381969 kubelet[3269]: W0325 01:36:40.381687 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.381969 kubelet[3269]: E0325 01:36:40.381779 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.381969 kubelet[3269]: E0325 01:36:40.381960 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.382141 kubelet[3269]: W0325 01:36:40.381973 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.382141 kubelet[3269]: E0325 01:36:40.382075 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.382956 kubelet[3269]: E0325 01:36:40.382936 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.382956 kubelet[3269]: W0325 01:36:40.382955 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.383092 kubelet[3269]: E0325 01:36:40.383067 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.383220 kubelet[3269]: E0325 01:36:40.383205 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.383294 kubelet[3269]: W0325 01:36:40.383220 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.383353 kubelet[3269]: E0325 01:36:40.383308 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.384223 kubelet[3269]: E0325 01:36:40.384203 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.384223 kubelet[3269]: W0325 01:36:40.384219 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.384545 kubelet[3269]: E0325 01:36:40.384331 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.384545 kubelet[3269]: E0325 01:36:40.384464 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.384545 kubelet[3269]: W0325 01:36:40.384487 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.384978 kubelet[3269]: E0325 01:36:40.384581 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.384978 kubelet[3269]: E0325 01:36:40.384705 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.384978 kubelet[3269]: W0325 01:36:40.384715 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.384978 kubelet[3269]: E0325 01:36:40.384804 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.384978 kubelet[3269]: E0325 01:36:40.384939 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.384978 kubelet[3269]: W0325 01:36:40.384949 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.385794 kubelet[3269]: E0325 01:36:40.385046 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.385794 kubelet[3269]: E0325 01:36:40.385181 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.385794 kubelet[3269]: W0325 01:36:40.385191 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.385794 kubelet[3269]: E0325 01:36:40.385208 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.385794 kubelet[3269]: E0325 01:36:40.385519 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.385794 kubelet[3269]: W0325 01:36:40.385532 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.385794 kubelet[3269]: E0325 01:36:40.385558 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.385794 kubelet[3269]: E0325 01:36:40.385766 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.385794 kubelet[3269]: W0325 01:36:40.385777 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.386140 kubelet[3269]: E0325 01:36:40.385801 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.386869 kubelet[3269]: E0325 01:36:40.386561 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.386869 kubelet[3269]: W0325 01:36:40.386580 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.386869 kubelet[3269]: E0325 01:36:40.386674 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.386869 kubelet[3269]: E0325 01:36:40.386842 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.386869 kubelet[3269]: W0325 01:36:40.386852 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.390655 kubelet[3269]: E0325 01:36:40.386973 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.390655 kubelet[3269]: E0325 01:36:40.387113 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.390655 kubelet[3269]: W0325 01:36:40.387121 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.390655 kubelet[3269]: E0325 01:36:40.387141 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.390655 kubelet[3269]: E0325 01:36:40.387626 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.390655 kubelet[3269]: W0325 01:36:40.387639 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.390655 kubelet[3269]: E0325 01:36:40.387652 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.394260 kubelet[3269]: E0325 01:36:40.394187 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.394260 kubelet[3269]: W0325 01:36:40.394203 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.394260 kubelet[3269]: E0325 01:36:40.394217 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.408210 kubelet[3269]: E0325 01:36:40.406655 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.408210 kubelet[3269]: W0325 01:36:40.406671 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.408210 kubelet[3269]: E0325 01:36:40.406701 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.434755 kubelet[3269]: E0325 01:36:40.434643 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szpdf" podUID="5aa0455c-b83c-4022-b024-2c5a8a7bbcae" Mar 25 01:36:40.455912 containerd[1737]: time="2025-03-25T01:36:40.455818788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-546fcb4b79-hwnnz,Uid:1b06ac7e-e600-4623-bdd8-0a72a5d7bc6a,Namespace:calico-system,Attempt:0,}" Mar 25 01:36:40.466168 kubelet[3269]: E0325 01:36:40.466090 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.466168 kubelet[3269]: W0325 01:36:40.466127 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.466554 kubelet[3269]: E0325 01:36:40.466150 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.466996 kubelet[3269]: E0325 01:36:40.466845 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.466996 kubelet[3269]: W0325 01:36:40.466863 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.466996 kubelet[3269]: E0325 01:36:40.466895 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.467181 kubelet[3269]: E0325 01:36:40.467135 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.467181 kubelet[3269]: W0325 01:36:40.467147 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.467181 kubelet[3269]: E0325 01:36:40.467161 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.467561 kubelet[3269]: E0325 01:36:40.467447 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.467561 kubelet[3269]: W0325 01:36:40.467466 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.467561 kubelet[3269]: E0325 01:36:40.467508 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.467947 kubelet[3269]: E0325 01:36:40.467788 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.467947 kubelet[3269]: W0325 01:36:40.467800 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.467947 kubelet[3269]: E0325 01:36:40.467814 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.468109 kubelet[3269]: E0325 01:36:40.468027 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.468109 kubelet[3269]: W0325 01:36:40.468038 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.468109 kubelet[3269]: E0325 01:36:40.468051 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.468267 kubelet[3269]: E0325 01:36:40.468245 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.468267 kubelet[3269]: W0325 01:36:40.468260 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.468354 kubelet[3269]: E0325 01:36:40.468273 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.468557 kubelet[3269]: E0325 01:36:40.468515 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.468557 kubelet[3269]: W0325 01:36:40.468532 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.468557 kubelet[3269]: E0325 01:36:40.468547 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:40.468892 kubelet[3269]: E0325 01:36:40.468753 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.468892 kubelet[3269]: W0325 01:36:40.468765 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.468892 kubelet[3269]: E0325 01:36:40.468780 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:40.469185 kubelet[3269]: E0325 01:36:40.469094 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:40.469185 kubelet[3269]: W0325 01:36:40.469109 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:40.469185 kubelet[3269]: E0325 01:36:40.469123 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[... the same driver-call.go:262 / driver-call.go:149 / plugins.go:695 FlexVolume error sequence repeats, interleaved with the entries below, through Mar 25 01:36:40.606 ...]
Mar 25 01:36:40.477113 kubelet[3269]: I0325 01:36:40.476981 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5aa0455c-b83c-4022-b024-2c5a8a7bbcae-registration-dir\") pod \"csi-node-driver-szpdf\" (UID: \"5aa0455c-b83c-4022-b024-2c5a8a7bbcae\") " pod="calico-system/csi-node-driver-szpdf"
Mar 25 01:36:40.477783 kubelet[3269]: I0325 01:36:40.477658 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5aa0455c-b83c-4022-b024-2c5a8a7bbcae-socket-dir\") pod \"csi-node-driver-szpdf\" (UID: \"5aa0455c-b83c-4022-b024-2c5a8a7bbcae\") " pod="calico-system/csi-node-driver-szpdf"
Mar 25 01:36:40.479052 kubelet[3269]: I0325 01:36:40.479006 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/5aa0455c-b83c-4022-b024-2c5a8a7bbcae-varrun\") pod \"csi-node-driver-szpdf\" (UID: \"5aa0455c-b83c-4022-b024-2c5a8a7bbcae\") " pod="calico-system/csi-node-driver-szpdf"
Mar 25 01:36:40.480979 kubelet[3269]: I0325 01:36:40.480952 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdj9h\" (UniqueName: \"kubernetes.io/projected/5aa0455c-b83c-4022-b024-2c5a8a7bbcae-kube-api-access-qdj9h\") pod \"csi-node-driver-szpdf\" (UID: \"5aa0455c-b83c-4022-b024-2c5a8a7bbcae\") " pod="calico-system/csi-node-driver-szpdf"
Mar 25 01:36:40.481918 kubelet[3269]: I0325 01:36:40.481881 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5aa0455c-b83c-4022-b024-2c5a8a7bbcae-kubelet-dir\") pod \"csi-node-driver-szpdf\" (UID: \"5aa0455c-b83c-4022-b024-2c5a8a7bbcae\") " pod="calico-system/csi-node-driver-szpdf"
Mar 25 01:36:40.516168 containerd[1737]: time="2025-03-25T01:36:40.516115154Z" level=info msg="connecting to shim 2055d1b4acc23abdfb397b282b6f87bf434d14cb2cf3baca38cd1d7314df0350" address="unix:///run/containerd/s/8b2bdb211176b8c9351e21197a4a1dfb8307ee68f73f59c13b521d8df8318541" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:36:40.548706 systemd[1]: Started cri-containerd-2055d1b4acc23abdfb397b282b6f87bf434d14cb2cf3baca38cd1d7314df0350.scope - libcontainer container 2055d1b4acc23abdfb397b282b6f87bf434d14cb2cf3baca38cd1d7314df0350.
Mar 25 01:36:40.561980 containerd[1737]: time="2025-03-25T01:36:40.561938061Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ltctp,Uid:accbb724-6515-4c25-bcfd-7ebc312cd5cc,Namespace:calico-system,Attempt:0,}"
Mar 25 01:36:40.638165 containerd[1737]: time="2025-03-25T01:36:40.636813888Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-546fcb4b79-hwnnz,Uid:1b06ac7e-e600-4623-bdd8-0a72a5d7bc6a,Namespace:calico-system,Attempt:0,} returns sandbox id \"2055d1b4acc23abdfb397b282b6f87bf434d14cb2cf3baca38cd1d7314df0350\""
Mar 25 01:36:40.641601 containerd[1737]: time="2025-03-25T01:36:40.641562941Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\""
Mar 25 01:36:40.646032 containerd[1737]: time="2025-03-25T01:36:40.645959689Z" level=info msg="connecting to shim 2774897b94846bafabc8eeaf51d3e8b5ab1938714ecb65ad2635e5302bc9d562" address="unix:///run/containerd/s/0468b1f6ac9fbc354159132a5433bf2c5c86f38d775daaa02500b2596ce8b900" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:36:40.683660 systemd[1]: Started cri-containerd-2774897b94846bafabc8eeaf51d3e8b5ab1938714ecb65ad2635e5302bc9d562.scope - libcontainer container 2774897b94846bafabc8eeaf51d3e8b5ab1938714ecb65ad2635e5302bc9d562.
Mar 25 01:36:40.722542 containerd[1737]: time="2025-03-25T01:36:40.722491835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ltctp,Uid:accbb724-6515-4c25-bcfd-7ebc312cd5cc,Namespace:calico-system,Attempt:0,} returns sandbox id \"2774897b94846bafabc8eeaf51d3e8b5ab1938714ecb65ad2635e5302bc9d562\""
Mar 25 01:36:42.106062 kubelet[3269]: E0325 01:36:42.105946 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szpdf" podUID="5aa0455c-b83c-4022-b024-2c5a8a7bbcae"
Mar 25 01:36:43.020354 containerd[1737]: time="2025-03-25T01:36:43.020302301Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:36:43.022972 containerd[1737]: time="2025-03-25T01:36:43.022894814Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075"
Mar 25 01:36:43.026754 containerd[1737]: time="2025-03-25T01:36:43.026692533Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:36:43.030601 containerd[1737]: time="2025-03-25T01:36:43.030550452Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:36:43.031602 containerd[1737]: time="2025-03-25T01:36:43.031161555Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 2.389554314s"
Mar 25 01:36:43.031602 containerd[1737]: time="2025-03-25T01:36:43.031197955Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\""
Mar 25 01:36:43.032231 containerd[1737]: time="2025-03-25T01:36:43.032206960Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\""
Mar 25 01:36:43.048044 containerd[1737]: time="2025-03-25T01:36:43.047585936Z" level=info msg="CreateContainer within sandbox \"2055d1b4acc23abdfb397b282b6f87bf434d14cb2cf3baca38cd1d7314df0350\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 25 01:36:43.073630 containerd[1737]: time="2025-03-25T01:36:43.069311943Z" level=info msg="Container 11b4a91c06f5f8f13d075e7b0dddf72add915fe5266cc778dbd7012dad20a8f7: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:36:43.090123 containerd[1737]: time="2025-03-25T01:36:43.090076446Z" level=info msg="CreateContainer within sandbox \"2055d1b4acc23abdfb397b282b6f87bf434d14cb2cf3baca38cd1d7314df0350\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"11b4a91c06f5f8f13d075e7b0dddf72add915fe5266cc778dbd7012dad20a8f7\""
Mar 25 01:36:43.090822 containerd[1737]: time="2025-03-25T01:36:43.090769749Z" level=info msg="StartContainer for \"11b4a91c06f5f8f13d075e7b0dddf72add915fe5266cc778dbd7012dad20a8f7\""
Mar 25 01:36:43.092037 containerd[1737]: time="2025-03-25T01:36:43.091997655Z" level=info msg="connecting to shim 11b4a91c06f5f8f13d075e7b0dddf72add915fe5266cc778dbd7012dad20a8f7" address="unix:///run/containerd/s/8b2bdb211176b8c9351e21197a4a1dfb8307ee68f73f59c13b521d8df8318541" protocol=ttrpc version=3
Mar 25 01:36:43.117635 systemd[1]: Started cri-containerd-11b4a91c06f5f8f13d075e7b0dddf72add915fe5266cc778dbd7012dad20a8f7.scope - libcontainer container 11b4a91c06f5f8f13d075e7b0dddf72add915fe5266cc778dbd7012dad20a8f7.
Mar 25 01:36:43.168617 containerd[1737]: time="2025-03-25T01:36:43.168550533Z" level=info msg="StartContainer for \"11b4a91c06f5f8f13d075e7b0dddf72add915fe5266cc778dbd7012dad20a8f7\" returns successfully"
Mar 25 01:36:43.211440 kubelet[3269]: I0325 01:36:43.211376 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-546fcb4b79-hwnnz" podStartSLOduration=0.819957717 podStartE2EDuration="3.211353845s" podCreationTimestamp="2025-03-25 01:36:40 +0000 UTC" firstStartedPulling="2025-03-25 01:36:40.640661231 +0000 UTC m=+12.639078682" lastFinishedPulling="2025-03-25 01:36:43.032057259 +0000 UTC m=+15.030474810" observedRunningTime="2025-03-25 01:36:43.209801237 +0000 UTC m=+15.208218688" watchObservedRunningTime="2025-03-25 01:36:43.211353845 +0000 UTC m=+15.209771296"
Mar 25 01:36:43.292234 kubelet[3269]: E0325 01:36:43.292017 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.292234 kubelet[3269]: W0325 01:36:43.292077 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.292234 kubelet[3269]: E0325 01:36:43.292129 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.292899 kubelet[3269]: E0325 01:36:43.292607 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.292899 kubelet[3269]: W0325 01:36:43.292624 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.292899 kubelet[3269]: E0325 01:36:43.292644 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.293145 kubelet[3269]: E0325 01:36:43.293121 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.293194 kubelet[3269]: W0325 01:36:43.293148 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.293194 kubelet[3269]: E0325 01:36:43.293168 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.293489 kubelet[3269]: E0325 01:36:43.293457 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.293489 kubelet[3269]: W0325 01:36:43.293471 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.293611 kubelet[3269]: E0325 01:36:43.293504 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.293777 kubelet[3269]: E0325 01:36:43.293762 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.293847 kubelet[3269]: W0325 01:36:43.293793 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.293847 kubelet[3269]: E0325 01:36:43.293816 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.294051 kubelet[3269]: E0325 01:36:43.294030 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.294051 kubelet[3269]: W0325 01:36:43.294045 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.294177 kubelet[3269]: E0325 01:36:43.294059 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.294293 kubelet[3269]: E0325 01:36:43.294274 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.294293 kubelet[3269]: W0325 01:36:43.294288 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.294439 kubelet[3269]: E0325 01:36:43.294302 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.294548 kubelet[3269]: E0325 01:36:43.294530 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.294548 kubelet[3269]: W0325 01:36:43.294545 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.294693 kubelet[3269]: E0325 01:36:43.294561 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.294802 kubelet[3269]: E0325 01:36:43.294784 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.294802 kubelet[3269]: W0325 01:36:43.294798 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.294945 kubelet[3269]: E0325 01:36:43.294811 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.295024 kubelet[3269]: E0325 01:36:43.295007 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.295024 kubelet[3269]: W0325 01:36:43.295017 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.295142 kubelet[3269]: E0325 01:36:43.295030 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.295225 kubelet[3269]: E0325 01:36:43.295215 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.295284 kubelet[3269]: W0325 01:36:43.295225 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.295284 kubelet[3269]: E0325 01:36:43.295237 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.295442 kubelet[3269]: E0325 01:36:43.295423 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.295442 kubelet[3269]: W0325 01:36:43.295436 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.295647 kubelet[3269]: E0325 01:36:43.295450 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.295714 kubelet[3269]: E0325 01:36:43.295676 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.295714 kubelet[3269]: W0325 01:36:43.295690 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.295714 kubelet[3269]: E0325 01:36:43.295703 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.295911 kubelet[3269]: E0325 01:36:43.295891 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.295911 kubelet[3269]: W0325 01:36:43.295905 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.296032 kubelet[3269]: E0325 01:36:43.295918 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.296132 kubelet[3269]: E0325 01:36:43.296114 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.296132 kubelet[3269]: W0325 01:36:43.296127 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.296241 kubelet[3269]: E0325 01:36:43.296140 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.304580 kubelet[3269]: E0325 01:36:43.304558 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.304580 kubelet[3269]: W0325 01:36:43.304574 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.304749 kubelet[3269]: E0325 01:36:43.304589 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.304911 kubelet[3269]: E0325 01:36:43.304893 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.304911 kubelet[3269]: W0325 01:36:43.304908 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.305021 kubelet[3269]: E0325 01:36:43.304936 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.305181 kubelet[3269]: E0325 01:36:43.305164 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.305181 kubelet[3269]: W0325 01:36:43.305179 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.305800 kubelet[3269]: E0325 01:36:43.305196 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.305869 kubelet[3269]: E0325 01:36:43.305811 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.305869 kubelet[3269]: W0325 01:36:43.305824 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.305869 kubelet[3269]: E0325 01:36:43.305856 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.306458 kubelet[3269]: E0325 01:36:43.306331 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.306458 kubelet[3269]: W0325 01:36:43.306347 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.306458 kubelet[3269]: E0325 01:36:43.306432 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.306731 kubelet[3269]: E0325 01:36:43.306652 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.306731 kubelet[3269]: W0325 01:36:43.306664 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.306834 kubelet[3269]: E0325 01:36:43.306750 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.306946 kubelet[3269]: E0325 01:36:43.306930 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.306946 kubelet[3269]: W0325 01:36:43.306943 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.307106 kubelet[3269]: E0325 01:36:43.307060 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.307194 kubelet[3269]: E0325 01:36:43.307181 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.307257 kubelet[3269]: W0325 01:36:43.307195 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.307257 kubelet[3269]: E0325 01:36:43.307215 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.307469 kubelet[3269]: E0325 01:36:43.307451 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.307469 kubelet[3269]: W0325 01:36:43.307466 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.307593 kubelet[3269]: E0325 01:36:43.307505 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.307865 kubelet[3269]: E0325 01:36:43.307847 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.307865 kubelet[3269]: W0325 01:36:43.307861 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.308234 kubelet[3269]: E0325 01:36:43.307880 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.308234 kubelet[3269]: E0325 01:36:43.308078 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.308234 kubelet[3269]: W0325 01:36:43.308088 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.308234 kubelet[3269]: E0325 01:36:43.308103 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.308417 kubelet[3269]: E0325 01:36:43.308350 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.308417 kubelet[3269]: W0325 01:36:43.308362 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.308529 kubelet[3269]: E0325 01:36:43.308452 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.308746 kubelet[3269]: E0325 01:36:43.308731 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.308822 kubelet[3269]: W0325 01:36:43.308798 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.308957 kubelet[3269]: E0325 01:36:43.308883 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.309035 kubelet[3269]: E0325 01:36:43.309019 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.309035 kubelet[3269]: W0325 01:36:43.309030 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.309199 kubelet[3269]: E0325 01:36:43.309053 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.309261 kubelet[3269]: E0325 01:36:43.309219 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.309261 kubelet[3269]: W0325 01:36:43.309229 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.309261 kubelet[3269]: E0325 01:36:43.309249 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.309471 kubelet[3269]: E0325 01:36:43.309452 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.309471 kubelet[3269]: W0325 01:36:43.309467 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.309599 kubelet[3269]: E0325 01:36:43.309499 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.309742 kubelet[3269]: E0325 01:36:43.309725 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.309742 kubelet[3269]: W0325 01:36:43.309738 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.309853 kubelet[3269]: E0325 01:36:43.309752 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:43.310282 kubelet[3269]: E0325 01:36:43.310265 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:43.310282 kubelet[3269]: W0325 01:36:43.310278 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:43.310371 kubelet[3269]: E0325 01:36:43.310292 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:44.104972 kubelet[3269]: E0325 01:36:44.103712 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szpdf" podUID="5aa0455c-b83c-4022-b024-2c5a8a7bbcae"
Mar 25 01:36:44.193570 kubelet[3269]: I0325 01:36:44.193533 3269 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 01:36:44.201211 kubelet[3269]: E0325 01:36:44.201170 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:44.201211 kubelet[3269]: W0325 01:36:44.201199 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:44.201440 kubelet[3269]: E0325 01:36:44.201220 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:44.201528 kubelet[3269]: E0325 01:36:44.201468 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:44.201528 kubelet[3269]: W0325 01:36:44.201494 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:44.201528 kubelet[3269]: E0325 01:36:44.201511 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:44.201767 kubelet[3269]: E0325 01:36:44.201722 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:44.201767 kubelet[3269]: W0325 01:36:44.201733 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:44.201767 kubelet[3269]: E0325 01:36:44.201744 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:44.202097 kubelet[3269]: E0325 01:36:44.201958 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:44.202097 kubelet[3269]: W0325 01:36:44.201970 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:44.202097 kubelet[3269]: E0325 01:36:44.201982 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:44.202290 kubelet[3269]: E0325 01:36:44.202188 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:44.202290 kubelet[3269]: W0325 01:36:44.202199 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:44.202290 kubelet[3269]: E0325 01:36:44.202212 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:36:44.202500 kubelet[3269]: E0325 01:36:44.202403 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:36:44.202500 kubelet[3269]: W0325 01:36:44.202414 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:36:44.202500 kubelet[3269]: E0325 01:36:44.202425 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.202688 kubelet[3269]: E0325 01:36:44.202643 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.202688 kubelet[3269]: W0325 01:36:44.202655 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.202688 kubelet[3269]: E0325 01:36:44.202668 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.202874 kubelet[3269]: E0325 01:36:44.202846 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.202874 kubelet[3269]: W0325 01:36:44.202856 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.202874 kubelet[3269]: E0325 01:36:44.202868 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.203083 kubelet[3269]: E0325 01:36:44.203068 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.203083 kubelet[3269]: W0325 01:36:44.203082 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.203209 kubelet[3269]: E0325 01:36:44.203094 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.203300 kubelet[3269]: E0325 01:36:44.203284 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.203300 kubelet[3269]: W0325 01:36:44.203297 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.203421 kubelet[3269]: E0325 01:36:44.203309 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.203532 kubelet[3269]: E0325 01:36:44.203504 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.203532 kubelet[3269]: W0325 01:36:44.203515 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.203532 kubelet[3269]: E0325 01:36:44.203526 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.203725 kubelet[3269]: E0325 01:36:44.203711 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.203725 kubelet[3269]: W0325 01:36:44.203723 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.203866 kubelet[3269]: E0325 01:36:44.203735 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.203929 kubelet[3269]: E0325 01:36:44.203918 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.203929 kubelet[3269]: W0325 01:36:44.203927 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.204109 kubelet[3269]: E0325 01:36:44.203939 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.204183 kubelet[3269]: E0325 01:36:44.204160 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.204183 kubelet[3269]: W0325 01:36:44.204171 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.204308 kubelet[3269]: E0325 01:36:44.204182 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.204387 kubelet[3269]: E0325 01:36:44.204363 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.204387 kubelet[3269]: W0325 01:36:44.204373 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.204387 kubelet[3269]: E0325 01:36:44.204384 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.210685 kubelet[3269]: E0325 01:36:44.210656 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.210685 kubelet[3269]: W0325 01:36:44.210671 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.210833 kubelet[3269]: E0325 01:36:44.210685 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.210995 kubelet[3269]: E0325 01:36:44.210978 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.210995 kubelet[3269]: W0325 01:36:44.210992 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.211130 kubelet[3269]: E0325 01:36:44.211009 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.211254 kubelet[3269]: E0325 01:36:44.211241 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.211254 kubelet[3269]: W0325 01:36:44.211252 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.211371 kubelet[3269]: E0325 01:36:44.211272 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.211544 kubelet[3269]: E0325 01:36:44.211526 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.211544 kubelet[3269]: W0325 01:36:44.211540 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.211988 kubelet[3269]: E0325 01:36:44.211558 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.211988 kubelet[3269]: E0325 01:36:44.211805 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.211988 kubelet[3269]: W0325 01:36:44.211817 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.211988 kubelet[3269]: E0325 01:36:44.211837 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.212165 kubelet[3269]: E0325 01:36:44.212056 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.212165 kubelet[3269]: W0325 01:36:44.212067 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.212165 kubelet[3269]: E0325 01:36:44.212092 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.212368 kubelet[3269]: E0325 01:36:44.212346 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.212368 kubelet[3269]: W0325 01:36:44.212361 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.212805 kubelet[3269]: E0325 01:36:44.212428 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.212805 kubelet[3269]: E0325 01:36:44.212584 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.212805 kubelet[3269]: W0325 01:36:44.212594 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.212805 kubelet[3269]: E0325 01:36:44.212675 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.212805 kubelet[3269]: E0325 01:36:44.212797 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.212805 kubelet[3269]: W0325 01:36:44.212807 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.213080 kubelet[3269]: E0325 01:36:44.212824 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.213080 kubelet[3269]: E0325 01:36:44.213007 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.213080 kubelet[3269]: W0325 01:36:44.213017 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.213080 kubelet[3269]: E0325 01:36:44.213028 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.213305 kubelet[3269]: E0325 01:36:44.213189 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.213305 kubelet[3269]: W0325 01:36:44.213199 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.213305 kubelet[3269]: E0325 01:36:44.213211 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.213681 kubelet[3269]: E0325 01:36:44.213437 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.213681 kubelet[3269]: W0325 01:36:44.213449 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.213681 kubelet[3269]: E0325 01:36:44.213470 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.213914 kubelet[3269]: E0325 01:36:44.213895 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.213914 kubelet[3269]: W0325 01:36:44.213910 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.214022 kubelet[3269]: E0325 01:36:44.213938 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.214182 kubelet[3269]: E0325 01:36:44.214166 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.214182 kubelet[3269]: W0325 01:36:44.214179 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.214292 kubelet[3269]: E0325 01:36:44.214206 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.214441 kubelet[3269]: E0325 01:36:44.214423 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.214441 kubelet[3269]: W0325 01:36:44.214437 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.214624 kubelet[3269]: E0325 01:36:44.214455 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.214715 kubelet[3269]: E0325 01:36:44.214697 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.214715 kubelet[3269]: W0325 01:36:44.214711 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.214805 kubelet[3269]: E0325 01:36:44.214729 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.215075 kubelet[3269]: E0325 01:36:44.215057 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.215075 kubelet[3269]: W0325 01:36:44.215070 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.215188 kubelet[3269]: E0325 01:36:44.215097 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:36:44.215344 kubelet[3269]: E0325 01:36:44.215327 3269 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:36:44.215344 kubelet[3269]: W0325 01:36:44.215340 3269 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:36:44.215431 kubelet[3269]: E0325 01:36:44.215354 3269 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:36:44.593271 containerd[1737]: time="2025-03-25T01:36:44.593218768Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:44.595748 containerd[1737]: time="2025-03-25T01:36:44.595675280Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 25 01:36:44.598648 containerd[1737]: time="2025-03-25T01:36:44.598609294Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:44.603650 containerd[1737]: time="2025-03-25T01:36:44.603589719Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:44.605196 containerd[1737]: time="2025-03-25T01:36:44.604597524Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 1.572355964s" Mar 25 01:36:44.605196 containerd[1737]: time="2025-03-25T01:36:44.604630824Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 25 01:36:44.607940 containerd[1737]: time="2025-03-25T01:36:44.607684139Z" level=info msg="CreateContainer within sandbox \"2774897b94846bafabc8eeaf51d3e8b5ab1938714ecb65ad2635e5302bc9d562\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 01:36:44.627089 containerd[1737]: time="2025-03-25T01:36:44.627052335Z" level=info msg="Container 15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:36:44.645955 containerd[1737]: time="2025-03-25T01:36:44.645917728Z" level=info msg="CreateContainer within sandbox \"2774897b94846bafabc8eeaf51d3e8b5ab1938714ecb65ad2635e5302bc9d562\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b\"" Mar 25 01:36:44.646464 containerd[1737]: time="2025-03-25T01:36:44.646438530Z" level=info msg="StartContainer for \"15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b\"" Mar 25 01:36:44.648565 containerd[1737]: time="2025-03-25T01:36:44.648276940Z" level=info msg="connecting to shim 15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b" address="unix:///run/containerd/s/0468b1f6ac9fbc354159132a5433bf2c5c86f38d775daaa02500b2596ce8b900" protocol=ttrpc version=3 Mar 25 01:36:44.673656 systemd[1]: Started cri-containerd-15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b.scope - libcontainer container 
15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b. Mar 25 01:36:44.715890 containerd[1737]: time="2025-03-25T01:36:44.715834473Z" level=info msg="StartContainer for \"15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b\" returns successfully" Mar 25 01:36:44.735670 systemd[1]: cri-containerd-15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b.scope: Deactivated successfully. Mar 25 01:36:44.740956 containerd[1737]: time="2025-03-25T01:36:44.740916197Z" level=info msg="TaskExit event in podsandbox handler container_id:\"15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b\" id:\"15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b\" pid:3969 exited_at:{seconds:1742866604 nanos:740293394}" Mar 25 01:36:44.741089 containerd[1737]: time="2025-03-25T01:36:44.740924897Z" level=info msg="received exit event container_id:\"15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b\" id:\"15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b\" pid:3969 exited_at:{seconds:1742866604 nanos:740293394}" Mar 25 01:36:44.762657 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-15ba0a5b6944239cace48c5f54589843ce4a338b24cc92ad71ac6a6b2e64a24b-rootfs.mount: Deactivated successfully. 
Mar 25 01:36:46.105719 kubelet[3269]: E0325 01:36:46.104387 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szpdf" podUID="5aa0455c-b83c-4022-b024-2c5a8a7bbcae" Mar 25 01:36:46.203186 containerd[1737]: time="2025-03-25T01:36:46.203139917Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 25 01:36:48.106147 kubelet[3269]: E0325 01:36:48.106103 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szpdf" podUID="5aa0455c-b83c-4022-b024-2c5a8a7bbcae" Mar 25 01:36:50.106554 kubelet[3269]: E0325 01:36:50.106268 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szpdf" podUID="5aa0455c-b83c-4022-b024-2c5a8a7bbcae" Mar 25 01:36:51.500534 containerd[1737]: time="2025-03-25T01:36:51.500443385Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:51.502425 containerd[1737]: time="2025-03-25T01:36:51.502350506Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 25 01:36:51.505551 containerd[1737]: time="2025-03-25T01:36:51.505467739Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:51.510132 containerd[1737]: 
time="2025-03-25T01:36:51.510070389Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:36:51.510904 containerd[1737]: time="2025-03-25T01:36:51.510706996Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 5.307517779s" Mar 25 01:36:51.510904 containerd[1737]: time="2025-03-25T01:36:51.510743396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 25 01:36:51.513813 containerd[1737]: time="2025-03-25T01:36:51.513181423Z" level=info msg="CreateContainer within sandbox \"2774897b94846bafabc8eeaf51d3e8b5ab1938714ecb65ad2635e5302bc9d562\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 01:36:51.531663 containerd[1737]: time="2025-03-25T01:36:51.531624122Z" level=info msg="Container b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:36:51.551487 containerd[1737]: time="2025-03-25T01:36:51.551446936Z" level=info msg="CreateContainer within sandbox \"2774897b94846bafabc8eeaf51d3e8b5ab1938714ecb65ad2635e5302bc9d562\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4\"" Mar 25 01:36:51.552170 containerd[1737]: time="2025-03-25T01:36:51.552018742Z" level=info msg="StartContainer for \"b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4\"" Mar 25 01:36:51.554640 containerd[1737]: time="2025-03-25T01:36:51.554554469Z" 
level=info msg="connecting to shim b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4" address="unix:///run/containerd/s/0468b1f6ac9fbc354159132a5433bf2c5c86f38d775daaa02500b2596ce8b900" protocol=ttrpc version=3 Mar 25 01:36:51.579623 systemd[1]: Started cri-containerd-b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4.scope - libcontainer container b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4. Mar 25 01:36:51.620152 containerd[1737]: time="2025-03-25T01:36:51.619842075Z" level=info msg="StartContainer for \"b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4\" returns successfully" Mar 25 01:36:52.105929 kubelet[3269]: E0325 01:36:52.105860 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-szpdf" podUID="5aa0455c-b83c-4022-b024-2c5a8a7bbcae" Mar 25 01:36:53.204069 containerd[1737]: time="2025-03-25T01:36:53.203822386Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: failed to load CNI config list file /etc/cni/net.d/10-calico.conflist: error parsing configuration list: unexpected end of JSON input: invalid cni config: failed to load cni config" Mar 25 01:36:53.205704 systemd[1]: cri-containerd-b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4.scope: Deactivated successfully. Mar 25 01:36:53.206029 systemd[1]: cri-containerd-b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4.scope: Consumed 455ms CPU time, 173.9M memory peak, 154M written to disk. 
Mar 25 01:36:53.208684 containerd[1737]: time="2025-03-25T01:36:53.208445236Z" level=info msg="received exit event container_id:\"b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4\" id:\"b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4\" pid:4026 exited_at:{seconds:1742866613 nanos:208249634}" Mar 25 01:36:53.208847 containerd[1737]: time="2025-03-25T01:36:53.208723339Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4\" id:\"b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4\" pid:4026 exited_at:{seconds:1742866613 nanos:208249634}" Mar 25 01:36:53.213837 kubelet[3269]: I0325 01:36:53.213595 3269 kubelet_node_status.go:502] "Fast updating node status as it just became ready" Mar 25 01:36:53.243110 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b99f49a351a514b5d62375c5d6f788c3e470488a4ad88447a361247e4f6f19f4-rootfs.mount: Deactivated successfully. 
Mar 25 01:36:53.276119 kubelet[3269]: I0325 01:36:53.276069 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b4f7b36-48f2-4d59-acac-c73cfa54d415-tigera-ca-bundle\") pod \"calico-kube-controllers-77fc9c9b48-fjbw6\" (UID: \"2b4f7b36-48f2-4d59-acac-c73cfa54d415\") " pod="calico-system/calico-kube-controllers-77fc9c9b48-fjbw6" Mar 25 01:36:53.276119 kubelet[3269]: I0325 01:36:53.276180 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtmsj\" (UniqueName: \"kubernetes.io/projected/2b4f7b36-48f2-4d59-acac-c73cfa54d415-kube-api-access-wtmsj\") pod \"calico-kube-controllers-77fc9c9b48-fjbw6\" (UID: \"2b4f7b36-48f2-4d59-acac-c73cfa54d415\") " pod="calico-system/calico-kube-controllers-77fc9c9b48-fjbw6" Mar 25 01:36:53.783148 kubelet[3269]: I0325 01:36:53.377185 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ccd98bf2-36fb-48d9-8a98-cf50d6e48549-calico-apiserver-certs\") pod \"calico-apiserver-664776b495-wcxlr\" (UID: \"ccd98bf2-36fb-48d9-8a98-cf50d6e48549\") " pod="calico-apiserver/calico-apiserver-664776b495-wcxlr" Mar 25 01:36:53.783148 kubelet[3269]: I0325 01:36:53.377220 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26lg7\" (UniqueName: \"kubernetes.io/projected/2f283303-5f02-4f4c-9d02-adcf7a86bf1f-kube-api-access-26lg7\") pod \"coredns-668d6bf9bc-dhkch\" (UID: \"2f283303-5f02-4f4c-9d02-adcf7a86bf1f\") " pod="kube-system/coredns-668d6bf9bc-dhkch" Mar 25 01:36:53.783148 kubelet[3269]: I0325 01:36:53.377265 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2f283303-5f02-4f4c-9d02-adcf7a86bf1f-config-volume\") pod \"coredns-668d6bf9bc-dhkch\" (UID: \"2f283303-5f02-4f4c-9d02-adcf7a86bf1f\") " pod="kube-system/coredns-668d6bf9bc-dhkch" Mar 25 01:36:53.783148 kubelet[3269]: I0325 01:36:53.377283 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9-config-volume\") pod \"coredns-668d6bf9bc-j26g9\" (UID: \"1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9\") " pod="kube-system/coredns-668d6bf9bc-j26g9" Mar 25 01:36:53.783148 kubelet[3269]: I0325 01:36:53.377301 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fphwn\" (UniqueName: \"kubernetes.io/projected/3be250d5-f6ab-4104-b9ce-65369492294d-kube-api-access-fphwn\") pod \"calico-apiserver-664776b495-tr6kg\" (UID: \"3be250d5-f6ab-4104-b9ce-65369492294d\") " pod="calico-apiserver/calico-apiserver-664776b495-tr6kg" Mar 25 01:36:53.295227 systemd[1]: Created slice kubepods-besteffort-pod2b4f7b36_48f2_4d59_acac_c73cfa54d415.slice - libcontainer container kubepods-besteffort-pod2b4f7b36_48f2_4d59_acac_c73cfa54d415.slice. 
Mar 25 01:36:53.783629 kubelet[3269]: I0325 01:36:53.377327 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3be250d5-f6ab-4104-b9ce-65369492294d-calico-apiserver-certs\") pod \"calico-apiserver-664776b495-tr6kg\" (UID: \"3be250d5-f6ab-4104-b9ce-65369492294d\") " pod="calico-apiserver/calico-apiserver-664776b495-tr6kg" Mar 25 01:36:53.783629 kubelet[3269]: I0325 01:36:53.377365 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skcph\" (UniqueName: \"kubernetes.io/projected/1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9-kube-api-access-skcph\") pod \"coredns-668d6bf9bc-j26g9\" (UID: \"1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9\") " pod="kube-system/coredns-668d6bf9bc-j26g9" Mar 25 01:36:53.783629 kubelet[3269]: I0325 01:36:53.377395 3269 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swvvh\" (UniqueName: \"kubernetes.io/projected/ccd98bf2-36fb-48d9-8a98-cf50d6e48549-kube-api-access-swvvh\") pod \"calico-apiserver-664776b495-wcxlr\" (UID: \"ccd98bf2-36fb-48d9-8a98-cf50d6e48549\") " pod="calico-apiserver/calico-apiserver-664776b495-wcxlr" Mar 25 01:36:53.315774 systemd[1]: Created slice kubepods-burstable-pod2f283303_5f02_4f4c_9d02_adcf7a86bf1f.slice - libcontainer container kubepods-burstable-pod2f283303_5f02_4f4c_9d02_adcf7a86bf1f.slice. Mar 25 01:36:53.324918 systemd[1]: Created slice kubepods-besteffort-podccd98bf2_36fb_48d9_8a98_cf50d6e48549.slice - libcontainer container kubepods-besteffort-podccd98bf2_36fb_48d9_8a98_cf50d6e48549.slice. Mar 25 01:36:53.333223 systemd[1]: Created slice kubepods-burstable-pod1cc53c71_960c_4a7f_8db5_ad1bfcf18cb9.slice - libcontainer container kubepods-burstable-pod1cc53c71_960c_4a7f_8db5_ad1bfcf18cb9.slice. 
Mar 25 01:36:53.342179 systemd[1]: Created slice kubepods-besteffort-pod3be250d5_f6ab_4104_b9ce_65369492294d.slice - libcontainer container kubepods-besteffort-pod3be250d5_f6ab_4104_b9ce_65369492294d.slice. Mar 25 01:36:54.086171 containerd[1737]: time="2025-03-25T01:36:54.086027816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77fc9c9b48-fjbw6,Uid:2b4f7b36-48f2-4d59-acac-c73cfa54d415,Namespace:calico-system,Attempt:0,}" Mar 25 01:36:54.092193 containerd[1737]: time="2025-03-25T01:36:54.092146682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dhkch,Uid:2f283303-5f02-4f4c-9d02-adcf7a86bf1f,Namespace:kube-system,Attempt:0,}" Mar 25 01:36:54.093748 containerd[1737]: time="2025-03-25T01:36:54.093718899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-664776b495-tr6kg,Uid:3be250d5-f6ab-4104-b9ce-65369492294d,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:36:54.101737 containerd[1737]: time="2025-03-25T01:36:54.101706386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j26g9,Uid:1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9,Namespace:kube-system,Attempt:0,}" Mar 25 01:36:54.102018 containerd[1737]: time="2025-03-25T01:36:54.101741686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-664776b495-wcxlr,Uid:ccd98bf2-36fb-48d9-8a98-cf50d6e48549,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:36:54.110106 systemd[1]: Created slice kubepods-besteffort-pod5aa0455c_b83c_4022_b024_2c5a8a7bbcae.slice - libcontainer container kubepods-besteffort-pod5aa0455c_b83c_4022_b024_2c5a8a7bbcae.slice. 
Mar 25 01:36:54.112215 containerd[1737]: time="2025-03-25T01:36:54.112179299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-szpdf,Uid:5aa0455c-b83c-4022-b024-2c5a8a7bbcae,Namespace:calico-system,Attempt:0,}" Mar 25 01:36:55.240427 containerd[1737]: time="2025-03-25T01:36:55.240372386Z" level=error msg="Failed to destroy network for sandbox \"081211d0926f08d83ed69e17e02120a81ba9dd19e0076263e918bc227892e2f4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.246284 containerd[1737]: time="2025-03-25T01:36:55.245649543Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 25 01:36:55.250947 containerd[1737]: time="2025-03-25T01:36:55.250732798Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77fc9c9b48-fjbw6,Uid:2b4f7b36-48f2-4d59-acac-c73cfa54d415,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"081211d0926f08d83ed69e17e02120a81ba9dd19e0076263e918bc227892e2f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.254639 kubelet[3269]: E0325 01:36:55.253188 3269 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"081211d0926f08d83ed69e17e02120a81ba9dd19e0076263e918bc227892e2f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.254639 kubelet[3269]: E0325 01:36:55.253255 3269 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network 
for sandbox \"081211d0926f08d83ed69e17e02120a81ba9dd19e0076263e918bc227892e2f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-77fc9c9b48-fjbw6" Mar 25 01:36:55.254639 kubelet[3269]: E0325 01:36:55.253283 3269 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"081211d0926f08d83ed69e17e02120a81ba9dd19e0076263e918bc227892e2f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-77fc9c9b48-fjbw6" Mar 25 01:36:55.255161 kubelet[3269]: E0325 01:36:55.253345 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-77fc9c9b48-fjbw6_calico-system(2b4f7b36-48f2-4d59-acac-c73cfa54d415)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-77fc9c9b48-fjbw6_calico-system(2b4f7b36-48f2-4d59-acac-c73cfa54d415)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"081211d0926f08d83ed69e17e02120a81ba9dd19e0076263e918bc227892e2f4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-77fc9c9b48-fjbw6" podUID="2b4f7b36-48f2-4d59-acac-c73cfa54d415" Mar 25 01:36:55.319673 containerd[1737]: time="2025-03-25T01:36:55.319196638Z" level=error msg="Failed to destroy network for sandbox \"77c382557416a48b5e67f6f592bf0cb0105bae0c4d733fd1cff01f2e9d31baf8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.324741 containerd[1737]: time="2025-03-25T01:36:55.324665897Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-664776b495-wcxlr,Uid:ccd98bf2-36fb-48d9-8a98-cf50d6e48549,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"77c382557416a48b5e67f6f592bf0cb0105bae0c4d733fd1cff01f2e9d31baf8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.326878 kubelet[3269]: E0325 01:36:55.325635 3269 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77c382557416a48b5e67f6f592bf0cb0105bae0c4d733fd1cff01f2e9d31baf8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.326878 kubelet[3269]: E0325 01:36:55.325724 3269 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77c382557416a48b5e67f6f592bf0cb0105bae0c4d733fd1cff01f2e9d31baf8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-664776b495-wcxlr" Mar 25 01:36:55.326878 kubelet[3269]: E0325 01:36:55.325756 3269 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"77c382557416a48b5e67f6f592bf0cb0105bae0c4d733fd1cff01f2e9d31baf8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-664776b495-wcxlr" Mar 25 01:36:55.327094 kubelet[3269]: E0325 01:36:55.325921 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-664776b495-wcxlr_calico-apiserver(ccd98bf2-36fb-48d9-8a98-cf50d6e48549)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-664776b495-wcxlr_calico-apiserver(ccd98bf2-36fb-48d9-8a98-cf50d6e48549)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"77c382557416a48b5e67f6f592bf0cb0105bae0c4d733fd1cff01f2e9d31baf8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-664776b495-wcxlr" podUID="ccd98bf2-36fb-48d9-8a98-cf50d6e48549" Mar 25 01:36:55.346874 containerd[1737]: time="2025-03-25T01:36:55.346719335Z" level=error msg="Failed to destroy network for sandbox \"747c77e3377e9ffd9b332f6d5d9c36557d640c77604685c1b4fd1b802b3715eb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.347133 containerd[1737]: time="2025-03-25T01:36:55.346968238Z" level=error msg="Failed to destroy network for sandbox \"5323ffb35fe233f9eca654a2e7b15ac6a888c8525f9d0274230ef2e5f7dd5177\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.349126 containerd[1737]: time="2025-03-25T01:36:55.348931259Z" level=error msg="Failed to destroy network for sandbox \"ca2c486899bfe2ee853f740994cc5dc70612d39a4aa1907e50e47d3a10b1d702\"" error="plugin type=\"calico\" failed (delete): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.351633 containerd[1737]: time="2025-03-25T01:36:55.351542887Z" level=error msg="Failed to destroy network for sandbox \"313a6b4e628c4c1fce2f25ffc14a662fbb8fc02221801fe6dbe8b5076cc83138\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.351857 containerd[1737]: time="2025-03-25T01:36:55.351762090Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-szpdf,Uid:5aa0455c-b83c-4022-b024-2c5a8a7bbcae,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"747c77e3377e9ffd9b332f6d5d9c36557d640c77604685c1b4fd1b802b3715eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.352368 kubelet[3269]: E0325 01:36:55.352328 3269 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"747c77e3377e9ffd9b332f6d5d9c36557d640c77604685c1b4fd1b802b3715eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.353020 kubelet[3269]: E0325 01:36:55.352397 3269 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"747c77e3377e9ffd9b332f6d5d9c36557d640c77604685c1b4fd1b802b3715eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-szpdf" Mar 25 01:36:55.353020 kubelet[3269]: E0325 01:36:55.352425 3269 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"747c77e3377e9ffd9b332f6d5d9c36557d640c77604685c1b4fd1b802b3715eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-szpdf" Mar 25 01:36:55.353020 kubelet[3269]: E0325 01:36:55.352519 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-szpdf_calico-system(5aa0455c-b83c-4022-b024-2c5a8a7bbcae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-szpdf_calico-system(5aa0455c-b83c-4022-b024-2c5a8a7bbcae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"747c77e3377e9ffd9b332f6d5d9c36557d640c77604685c1b4fd1b802b3715eb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-szpdf" podUID="5aa0455c-b83c-4022-b024-2c5a8a7bbcae" Mar 25 01:36:55.355613 containerd[1737]: time="2025-03-25T01:36:55.355455030Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j26g9,Uid:1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5323ffb35fe233f9eca654a2e7b15ac6a888c8525f9d0274230ef2e5f7dd5177\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.356353 kubelet[3269]: E0325 01:36:55.356315 3269 log.go:32] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5323ffb35fe233f9eca654a2e7b15ac6a888c8525f9d0274230ef2e5f7dd5177\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.356431 kubelet[3269]: E0325 01:36:55.356376 3269 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5323ffb35fe233f9eca654a2e7b15ac6a888c8525f9d0274230ef2e5f7dd5177\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-j26g9" Mar 25 01:36:55.356431 kubelet[3269]: E0325 01:36:55.356400 3269 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5323ffb35fe233f9eca654a2e7b15ac6a888c8525f9d0274230ef2e5f7dd5177\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-j26g9" Mar 25 01:36:55.357152 kubelet[3269]: E0325 01:36:55.356443 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-j26g9_kube-system(1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-j26g9_kube-system(1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5323ffb35fe233f9eca654a2e7b15ac6a888c8525f9d0274230ef2e5f7dd5177\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-j26g9" podUID="1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9" Mar 25 01:36:55.358789 containerd[1737]: time="2025-03-25T01:36:55.358699965Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-664776b495-tr6kg,Uid:3be250d5-f6ab-4104-b9ce-65369492294d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca2c486899bfe2ee853f740994cc5dc70612d39a4aa1907e50e47d3a10b1d702\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.358930 kubelet[3269]: E0325 01:36:55.358886 3269 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca2c486899bfe2ee853f740994cc5dc70612d39a4aa1907e50e47d3a10b1d702\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.358930 kubelet[3269]: E0325 01:36:55.358929 3269 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca2c486899bfe2ee853f740994cc5dc70612d39a4aa1907e50e47d3a10b1d702\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-664776b495-tr6kg" Mar 25 01:36:55.359065 kubelet[3269]: E0325 01:36:55.358955 3269 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ca2c486899bfe2ee853f740994cc5dc70612d39a4aa1907e50e47d3a10b1d702\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-664776b495-tr6kg" Mar 25 01:36:55.359065 kubelet[3269]: E0325 01:36:55.359000 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-664776b495-tr6kg_calico-apiserver(3be250d5-f6ab-4104-b9ce-65369492294d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-664776b495-tr6kg_calico-apiserver(3be250d5-f6ab-4104-b9ce-65369492294d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ca2c486899bfe2ee853f740994cc5dc70612d39a4aa1907e50e47d3a10b1d702\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-664776b495-tr6kg" podUID="3be250d5-f6ab-4104-b9ce-65369492294d" Mar 25 01:36:55.360998 containerd[1737]: time="2025-03-25T01:36:55.360946389Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dhkch,Uid:2f283303-5f02-4f4c-9d02-adcf7a86bf1f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"313a6b4e628c4c1fce2f25ffc14a662fbb8fc02221801fe6dbe8b5076cc83138\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.362305 kubelet[3269]: E0325 01:36:55.362026 3269 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"313a6b4e628c4c1fce2f25ffc14a662fbb8fc02221801fe6dbe8b5076cc83138\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Mar 25 01:36:55.362305 kubelet[3269]: E0325 01:36:55.362095 3269 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"313a6b4e628c4c1fce2f25ffc14a662fbb8fc02221801fe6dbe8b5076cc83138\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dhkch" Mar 25 01:36:55.362305 kubelet[3269]: E0325 01:36:55.362119 3269 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"313a6b4e628c4c1fce2f25ffc14a662fbb8fc02221801fe6dbe8b5076cc83138\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dhkch" Mar 25 01:36:55.362436 kubelet[3269]: E0325 01:36:55.362164 3269 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-dhkch_kube-system(2f283303-5f02-4f4c-9d02-adcf7a86bf1f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-dhkch_kube-system(2f283303-5f02-4f4c-9d02-adcf7a86bf1f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"313a6b4e628c4c1fce2f25ffc14a662fbb8fc02221801fe6dbe8b5076cc83138\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-dhkch" podUID="2f283303-5f02-4f4c-9d02-adcf7a86bf1f" Mar 25 01:36:56.046084 systemd[1]: run-netns-cni\x2d79340de6\x2dab2c\x2dab58\x2de901\x2db22d17c5ab8c.mount: Deactivated successfully. 
Mar 25 01:36:56.046236 systemd[1]: run-netns-cni\x2dbe0a2e55\x2d5abe\x2dcf1f\x2d2dde\x2d722241d40ba0.mount: Deactivated successfully. Mar 25 01:36:56.046340 systemd[1]: run-netns-cni\x2d2b208515\x2d6688\x2df075\x2dcd6d\x2dd3287699954f.mount: Deactivated successfully. Mar 25 01:36:56.046442 systemd[1]: run-netns-cni\x2d0c2ca357\x2daf33\x2da540\x2d27b8\x2db91ed7e8dc18.mount: Deactivated successfully. Mar 25 01:36:56.046577 systemd[1]: run-netns-cni\x2d9379041d\x2d2bb8\x2d15e7\x2df3ad\x2d5bee8afee23b.mount: Deactivated successfully. Mar 25 01:36:58.467459 kubelet[3269]: I0325 01:36:58.467019 3269 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:37:02.908087 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3008914318.mount: Deactivated successfully. Mar 25 01:37:02.956687 containerd[1737]: time="2025-03-25T01:37:02.956638088Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:37:02.959361 containerd[1737]: time="2025-03-25T01:37:02.959279616Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 25 01:37:02.962556 containerd[1737]: time="2025-03-25T01:37:02.962525051Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:37:02.966468 containerd[1737]: time="2025-03-25T01:37:02.966418693Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:37:02.967081 containerd[1737]: time="2025-03-25T01:37:02.966922698Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag 
\"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 7.721222554s" Mar 25 01:37:02.967081 containerd[1737]: time="2025-03-25T01:37:02.966966898Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 25 01:37:02.976407 containerd[1737]: time="2025-03-25T01:37:02.975399289Z" level=info msg="CreateContainer within sandbox \"2774897b94846bafabc8eeaf51d3e8b5ab1938714ecb65ad2635e5302bc9d562\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 01:37:03.004507 containerd[1737]: time="2025-03-25T01:37:03.000469657Z" level=info msg="Container ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:37:03.026340 containerd[1737]: time="2025-03-25T01:37:03.026291633Z" level=info msg="CreateContainer within sandbox \"2774897b94846bafabc8eeaf51d3e8b5ab1938714ecb65ad2635e5302bc9d562\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4\"" Mar 25 01:37:03.027234 containerd[1737]: time="2025-03-25T01:37:03.026947140Z" level=info msg="StartContainer for \"ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4\"" Mar 25 01:37:03.028968 containerd[1737]: time="2025-03-25T01:37:03.028931361Z" level=info msg="connecting to shim ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4" address="unix:///run/containerd/s/0468b1f6ac9fbc354159132a5433bf2c5c86f38d775daaa02500b2596ce8b900" protocol=ttrpc version=3 Mar 25 01:37:03.049650 systemd[1]: Started cri-containerd-ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4.scope - libcontainer container ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4. 
Mar 25 01:37:03.093795 containerd[1737]: time="2025-03-25T01:37:03.093745654Z" level=info msg="StartContainer for \"ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4\" returns successfully" Mar 25 01:37:03.292493 kubelet[3269]: I0325 01:37:03.291143 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-ltctp" podStartSLOduration=1.047252007 podStartE2EDuration="23.291121764s" podCreationTimestamp="2025-03-25 01:36:40 +0000 UTC" firstStartedPulling="2025-03-25 01:36:40.723958151 +0000 UTC m=+12.722375602" lastFinishedPulling="2025-03-25 01:37:02.967827908 +0000 UTC m=+34.966245359" observedRunningTime="2025-03-25 01:37:03.290438757 +0000 UTC m=+35.288856308" watchObservedRunningTime="2025-03-25 01:37:03.291121764 +0000 UTC m=+35.289539215" Mar 25 01:37:03.550727 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 25 01:37:03.550897 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. 
Mar 25 01:37:05.165595 kernel: bpftool[4452]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 25 01:37:05.425936 systemd-networkd[1573]: vxlan.calico: Link UP Mar 25 01:37:05.425946 systemd-networkd[1573]: vxlan.calico: Gained carrier Mar 25 01:37:06.105367 containerd[1737]: time="2025-03-25T01:37:06.105036251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j26g9,Uid:1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9,Namespace:kube-system,Attempt:0,}" Mar 25 01:37:06.105367 containerd[1737]: time="2025-03-25T01:37:06.105036751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-664776b495-tr6kg,Uid:3be250d5-f6ab-4104-b9ce-65369492294d,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:37:06.253220 systemd-networkd[1573]: cali62b5143ad66: Link UP Mar 25 01:37:06.253438 systemd-networkd[1573]: cali62b5143ad66: Gained carrier Mar 25 01:37:06.273374 containerd[1737]: 2025-03-25 01:37:06.170 [INFO][4530] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0 calico-apiserver-664776b495- calico-apiserver 3be250d5-f6ab-4104-b9ce-65369492294d 709 0 2025-03-25 01:36:40 +0000 UTC <nil> <nil> map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:664776b495 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284.0.0-a-0ecc1f6a74 calico-apiserver-664776b495-tr6kg eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali62b5143ad66 [] []}} ContainerID="52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-tr6kg" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-" Mar 25 01:37:06.273374 containerd[1737]: 
2025-03-25 01:37:06.170 [INFO][4530] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-tr6kg" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0" Mar 25 01:37:06.273374 containerd[1737]: 2025-03-25 01:37:06.210 [INFO][4548] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" HandleID="k8s-pod-network.52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0" Mar 25 01:37:06.273719 containerd[1737]: 2025-03-25 01:37:06.220 [INFO][4548] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" HandleID="k8s-pod-network.52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002907c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284.0.0-a-0ecc1f6a74", "pod":"calico-apiserver-664776b495-tr6kg", "timestamp":"2025-03-25 01:37:06.210205075 +0000 UTC"}, Hostname:"ci-4284.0.0-a-0ecc1f6a74", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:37:06.273719 containerd[1737]: 2025-03-25 01:37:06.220 [INFO][4548] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:37:06.273719 containerd[1737]: 2025-03-25 01:37:06.220 [INFO][4548] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:37:06.273719 containerd[1737]: 2025-03-25 01:37:06.220 [INFO][4548] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-0ecc1f6a74' Mar 25 01:37:06.273719 containerd[1737]: 2025-03-25 01:37:06.222 [INFO][4548] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.273719 containerd[1737]: 2025-03-25 01:37:06.225 [INFO][4548] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.273719 containerd[1737]: 2025-03-25 01:37:06.229 [INFO][4548] ipam/ipam.go 489: Trying affinity for 192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.273719 containerd[1737]: 2025-03-25 01:37:06.230 [INFO][4548] ipam/ipam.go 155: Attempting to load block cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.273719 containerd[1737]: 2025-03-25 01:37:06.232 [INFO][4548] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.274058 containerd[1737]: 2025-03-25 01:37:06.232 [INFO][4548] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.7.128/26 handle="k8s-pod-network.52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.274058 containerd[1737]: 2025-03-25 01:37:06.235 [INFO][4548] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92 Mar 25 01:37:06.274058 containerd[1737]: 2025-03-25 01:37:06.241 [INFO][4548] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.7.128/26 handle="k8s-pod-network.52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.274058 containerd[1737]: 2025-03-25 01:37:06.247 [INFO][4548] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.7.129/26] block=192.168.7.128/26 handle="k8s-pod-network.52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.274058 containerd[1737]: 2025-03-25 01:37:06.247 [INFO][4548] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.7.129/26] handle="k8s-pod-network.52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.274058 containerd[1737]: 2025-03-25 01:37:06.247 [INFO][4548] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:37:06.274058 containerd[1737]: 2025-03-25 01:37:06.247 [INFO][4548] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.129/26] IPv6=[] ContainerID="52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" HandleID="k8s-pod-network.52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0" Mar 25 01:37:06.274941 containerd[1737]: 2025-03-25 01:37:06.250 [INFO][4530] cni-plugin/k8s.go 386: Populated endpoint ContainerID="52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-tr6kg" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0", GenerateName:"calico-apiserver-664776b495-", Namespace:"calico-apiserver", SelfLink:"", UID:"3be250d5-f6ab-4104-b9ce-65369492294d", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 40, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"664776b495", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"", Pod:"calico-apiserver-664776b495-tr6kg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali62b5143ad66", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:37:06.275123 containerd[1737]: 2025-03-25 01:37:06.250 [INFO][4530] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.7.129/32] ContainerID="52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-tr6kg" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0" Mar 25 01:37:06.275123 containerd[1737]: 2025-03-25 01:37:06.250 [INFO][4530] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali62b5143ad66 ContainerID="52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-tr6kg" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0" Mar 25 01:37:06.275123 containerd[1737]: 2025-03-25 01:37:06.253 [INFO][4530] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-tr6kg" 
WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0" Mar 25 01:37:06.275314 containerd[1737]: 2025-03-25 01:37:06.254 [INFO][4530] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-tr6kg" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0", GenerateName:"calico-apiserver-664776b495-", Namespace:"calico-apiserver", SelfLink:"", UID:"3be250d5-f6ab-4104-b9ce-65369492294d", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 40, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"664776b495", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92", Pod:"calico-apiserver-664776b495-tr6kg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali62b5143ad66", MAC:"82:bf:c3:f8:6f:13", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:37:06.275439 containerd[1737]: 2025-03-25 01:37:06.270 [INFO][4530] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-tr6kg" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--tr6kg-eth0" Mar 25 01:37:06.326889 containerd[1737]: time="2025-03-25T01:37:06.326696121Z" level=info msg="connecting to shim 52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92" address="unix:///run/containerd/s/e3e78dcb778d069ff5984e432a9fe98ca085a15f15fffdf7d2f726e8157518ed" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:37:06.368665 systemd[1]: Started cri-containerd-52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92.scope - libcontainer container 52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92. Mar 25 01:37:06.387669 systemd-networkd[1573]: calia4e7a966641: Link UP Mar 25 01:37:06.388525 systemd-networkd[1573]: calia4e7a966641: Gained carrier Mar 25 01:37:06.405702 containerd[1737]: 2025-03-25 01:37:06.170 [INFO][4523] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0 coredns-668d6bf9bc- kube-system 1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9 705 0 2025-03-25 01:36:33 +0000 UTC <nil> <nil> map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284.0.0-a-0ecc1f6a74 coredns-668d6bf9bc-j26g9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia4e7a966641 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-j26g9" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-" Mar 25 01:37:06.405702 containerd[1737]: 2025-03-25 01:37:06.171 [INFO][4523] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" Namespace="kube-system" Pod="coredns-668d6bf9bc-j26g9" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0" Mar 25 01:37:06.405702 containerd[1737]: 2025-03-25 01:37:06.211 [INFO][4549] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" HandleID="k8s-pod-network.13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0" Mar 25 01:37:06.405984 containerd[1737]: 2025-03-25 01:37:06.221 [INFO][4549] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" HandleID="k8s-pod-network.13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003bdc50), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284.0.0-a-0ecc1f6a74", "pod":"coredns-668d6bf9bc-j26g9", "timestamp":"2025-03-25 01:37:06.211044984 +0000 UTC"}, Hostname:"ci-4284.0.0-a-0ecc1f6a74", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:37:06.405984 containerd[1737]: 2025-03-25 01:37:06.221 [INFO][4549] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:37:06.405984 containerd[1737]: 2025-03-25 01:37:06.247 [INFO][4549] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:37:06.405984 containerd[1737]: 2025-03-25 01:37:06.247 [INFO][4549] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-0ecc1f6a74' Mar 25 01:37:06.405984 containerd[1737]: 2025-03-25 01:37:06.327 [INFO][4549] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.405984 containerd[1737]: 2025-03-25 01:37:06.340 [INFO][4549] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.405984 containerd[1737]: 2025-03-25 01:37:06.348 [INFO][4549] ipam/ipam.go 489: Trying affinity for 192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.405984 containerd[1737]: 2025-03-25 01:37:06.350 [INFO][4549] ipam/ipam.go 155: Attempting to load block cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.405984 containerd[1737]: 2025-03-25 01:37:06.353 [INFO][4549] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.406345 containerd[1737]: 2025-03-25 01:37:06.354 [INFO][4549] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.7.128/26 handle="k8s-pod-network.13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.406345 containerd[1737]: 2025-03-25 01:37:06.358 [INFO][4549] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7 Mar 25 01:37:06.406345 containerd[1737]: 2025-03-25 01:37:06.365 [INFO][4549] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.7.128/26 handle="k8s-pod-network.13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.406345 containerd[1737]: 2025-03-25 01:37:06.381 [INFO][4549] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.7.130/26] block=192.168.7.128/26 handle="k8s-pod-network.13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.406345 containerd[1737]: 2025-03-25 01:37:06.382 [INFO][4549] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.7.130/26] handle="k8s-pod-network.13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:06.406345 containerd[1737]: 2025-03-25 01:37:06.382 [INFO][4549] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:37:06.406345 containerd[1737]: 2025-03-25 01:37:06.382 [INFO][4549] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.130/26] IPv6=[] ContainerID="13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" HandleID="k8s-pod-network.13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0" Mar 25 01:37:06.407535 containerd[1737]: 2025-03-25 01:37:06.384 [INFO][4523] cni-plugin/k8s.go 386: Populated endpoint ContainerID="13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" Namespace="kube-system" Pod="coredns-668d6bf9bc-j26g9" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9", ResourceVersion:"705", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 33, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"", Pod:"coredns-668d6bf9bc-j26g9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia4e7a966641", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:37:06.407535 containerd[1737]: 2025-03-25 01:37:06.384 [INFO][4523] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.7.130/32] ContainerID="13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" Namespace="kube-system" Pod="coredns-668d6bf9bc-j26g9" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0" Mar 25 01:37:06.407535 containerd[1737]: 2025-03-25 01:37:06.384 [INFO][4523] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia4e7a966641 ContainerID="13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" Namespace="kube-system" Pod="coredns-668d6bf9bc-j26g9" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0" Mar 25 01:37:06.407535 containerd[1737]: 2025-03-25 01:37:06.388 [INFO][4523] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" Namespace="kube-system" Pod="coredns-668d6bf9bc-j26g9" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0" Mar 25 01:37:06.407535 containerd[1737]: 2025-03-25 01:37:06.389 [INFO][4523] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" Namespace="kube-system" Pod="coredns-668d6bf9bc-j26g9" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9", ResourceVersion:"705", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 33, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7", Pod:"coredns-668d6bf9bc-j26g9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia4e7a966641", MAC:"0a:9e:3d:7d:db:1d", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:37:06.407535 containerd[1737]: 2025-03-25 01:37:06.401 [INFO][4523] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" Namespace="kube-system" Pod="coredns-668d6bf9bc-j26g9" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--j26g9-eth0" Mar 25 01:37:06.452512 containerd[1737]: time="2025-03-25T01:37:06.450920449Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-664776b495-tr6kg,Uid:3be250d5-f6ab-4104-b9ce-65369492294d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92\"" Mar 25 01:37:06.456201 containerd[1737]: time="2025-03-25T01:37:06.456021804Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:37:06.462820 containerd[1737]: time="2025-03-25T01:37:06.462756776Z" level=info msg="connecting to shim 13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7" address="unix:///run/containerd/s/40181be75a7cceef0bd7c47ae697ff40233a230d78c1da9f5c1ae2c410bdb928" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:37:06.486748 systemd[1]: Started cri-containerd-13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7.scope - libcontainer container 13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7. 
Mar 25 01:37:06.542413 containerd[1737]: time="2025-03-25T01:37:06.542357727Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-j26g9,Uid:1cc53c71-960c-4a7f-8db5-ad1bfcf18cb9,Namespace:kube-system,Attempt:0,} returns sandbox id \"13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7\"" Mar 25 01:37:06.545082 containerd[1737]: time="2025-03-25T01:37:06.544948354Z" level=info msg="CreateContainer within sandbox \"13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:37:06.565494 containerd[1737]: time="2025-03-25T01:37:06.565453374Z" level=info msg="Container 21ea0bad26ef07b821299a84e69eab2b28a624abfb3e163c8d0bac9fcbbc8514: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:37:06.575509 containerd[1737]: time="2025-03-25T01:37:06.575456481Z" level=info msg="CreateContainer within sandbox \"13f639adcce3af6d85881fa9d537564468025190b09de90d8480725fa8f0b0b7\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"21ea0bad26ef07b821299a84e69eab2b28a624abfb3e163c8d0bac9fcbbc8514\"" Mar 25 01:37:06.576969 containerd[1737]: time="2025-03-25T01:37:06.576005686Z" level=info msg="StartContainer for \"21ea0bad26ef07b821299a84e69eab2b28a624abfb3e163c8d0bac9fcbbc8514\"" Mar 25 01:37:06.577131 containerd[1737]: time="2025-03-25T01:37:06.577106298Z" level=info msg="connecting to shim 21ea0bad26ef07b821299a84e69eab2b28a624abfb3e163c8d0bac9fcbbc8514" address="unix:///run/containerd/s/40181be75a7cceef0bd7c47ae697ff40233a230d78c1da9f5c1ae2c410bdb928" protocol=ttrpc version=3 Mar 25 01:37:06.596649 systemd[1]: Started cri-containerd-21ea0bad26ef07b821299a84e69eab2b28a624abfb3e163c8d0bac9fcbbc8514.scope - libcontainer container 21ea0bad26ef07b821299a84e69eab2b28a624abfb3e163c8d0bac9fcbbc8514. 
Mar 25 01:37:06.627771 containerd[1737]: time="2025-03-25T01:37:06.627591638Z" level=info msg="StartContainer for \"21ea0bad26ef07b821299a84e69eab2b28a624abfb3e163c8d0bac9fcbbc8514\" returns successfully" Mar 25 01:37:07.016734 systemd-networkd[1573]: vxlan.calico: Gained IPv6LL Mar 25 01:37:07.104972 containerd[1737]: time="2025-03-25T01:37:07.104912812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-szpdf,Uid:5aa0455c-b83c-4022-b024-2c5a8a7bbcae,Namespace:calico-system,Attempt:0,}" Mar 25 01:37:07.224202 systemd-networkd[1573]: cali6ce8915b662: Link UP Mar 25 01:37:07.224960 systemd-networkd[1573]: cali6ce8915b662: Gained carrier Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.148 [INFO][4712] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0 csi-node-driver- calico-system 5aa0455c-b83c-4022-b024-2c5a8a7bbcae 617 0 2025-03-25 01:36:40 +0000 UTC <nil> <nil> map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:54877d75d5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284.0.0-a-0ecc1f6a74 csi-node-driver-szpdf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali6ce8915b662 [] []}} ContainerID="036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" Namespace="calico-system" Pod="csi-node-driver-szpdf" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.148 [INFO][4712] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" Namespace="calico-system" Pod="csi-node-driver-szpdf" 
WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.177 [INFO][4725] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" HandleID="k8s-pod-network.036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.185 [INFO][4725] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" HandleID="k8s-pod-network.036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000311660), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284.0.0-a-0ecc1f6a74", "pod":"csi-node-driver-szpdf", "timestamp":"2025-03-25 01:37:07.177235303 +0000 UTC"}, Hostname:"ci-4284.0.0-a-0ecc1f6a74", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.185 [INFO][4725] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.185 [INFO][4725] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.185 [INFO][4725] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-0ecc1f6a74' Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.187 [INFO][4725] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.191 [INFO][4725] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.195 [INFO][4725] ipam/ipam.go 489: Trying affinity for 192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.197 [INFO][4725] ipam/ipam.go 155: Attempting to load block cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.199 [INFO][4725] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.199 [INFO][4725] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.7.128/26 handle="k8s-pod-network.036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.200 [INFO][4725] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411 Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.204 [INFO][4725] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.7.128/26 handle="k8s-pod-network.036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.219 [INFO][4725] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.7.131/26] block=192.168.7.128/26 handle="k8s-pod-network.036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.219 [INFO][4725] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.7.131/26] handle="k8s-pod-network.036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.219 [INFO][4725] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:37:07.240662 containerd[1737]: 2025-03-25 01:37:07.219 [INFO][4725] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.131/26] IPv6=[] ContainerID="036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" HandleID="k8s-pod-network.036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0" Mar 25 01:37:07.243346 containerd[1737]: 2025-03-25 01:37:07.221 [INFO][4712] cni-plugin/k8s.go 386: Populated endpoint ContainerID="036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" Namespace="calico-system" Pod="csi-node-driver-szpdf" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"5aa0455c-b83c-4022-b024-2c5a8a7bbcae", ResourceVersion:"617", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 40, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"", Pod:"csi-node-driver-szpdf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.7.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6ce8915b662", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:37:07.243346 containerd[1737]: 2025-03-25 01:37:07.221 [INFO][4712] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.7.131/32] ContainerID="036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" Namespace="calico-system" Pod="csi-node-driver-szpdf" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0" Mar 25 01:37:07.243346 containerd[1737]: 2025-03-25 01:37:07.221 [INFO][4712] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6ce8915b662 ContainerID="036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" Namespace="calico-system" Pod="csi-node-driver-szpdf" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0" Mar 25 01:37:07.243346 containerd[1737]: 2025-03-25 01:37:07.225 [INFO][4712] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" Namespace="calico-system" Pod="csi-node-driver-szpdf" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0" Mar 25 01:37:07.243346 containerd[1737]: 2025-03-25 01:37:07.225 
[INFO][4712] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" Namespace="calico-system" Pod="csi-node-driver-szpdf" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"5aa0455c-b83c-4022-b024-2c5a8a7bbcae", ResourceVersion:"617", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 40, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411", Pod:"csi-node-driver-szpdf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.7.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6ce8915b662", MAC:"9e:7f:d4:b1:d1:99", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:37:07.243346 containerd[1737]: 2025-03-25 01:37:07.237 [INFO][4712] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" Namespace="calico-system" Pod="csi-node-driver-szpdf" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-csi--node--driver--szpdf-eth0" Mar 25 01:37:07.300548 containerd[1737]: time="2025-03-25T01:37:07.299307299Z" level=info msg="connecting to shim 036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411" address="unix:///run/containerd/s/a48df458173002e1b5618773fb1a342206fd980e8b4e17350dca38af8b62ff57" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:37:07.339681 systemd[1]: Started cri-containerd-036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411.scope - libcontainer container 036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411. Mar 25 01:37:07.340974 kubelet[3269]: I0325 01:37:07.340757 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-j26g9" podStartSLOduration=34.340605637 podStartE2EDuration="34.340605637s" podCreationTimestamp="2025-03-25 01:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:37:07.317683149 +0000 UTC m=+39.316100700" watchObservedRunningTime="2025-03-25 01:37:07.340605637 +0000 UTC m=+39.339023088" Mar 25 01:37:07.352766 kubelet[3269]: I0325 01:37:07.352152 3269 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:37:07.410515 containerd[1737]: time="2025-03-25T01:37:07.410398907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-szpdf,Uid:5aa0455c-b83c-4022-b024-2c5a8a7bbcae,Namespace:calico-system,Attempt:0,} returns sandbox id \"036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411\"" Mar 25 01:37:07.446967 containerd[1737]: time="2025-03-25T01:37:07.446908105Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4\" id:\"dba4a0ec8e825319621a5e0ea32e9e736bd65169ead1a750a7b14943c030ec93\" pid:4799 exited_at:{seconds:1742866627 nanos:445578094}" Mar 25 01:37:07.535400 containerd[1737]: time="2025-03-25T01:37:07.535354027Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4\" id:\"197f5bc57b596d9dccbcc717329c311675e7acf64df9f2328be40ffccc19b117\" pid:4832 exited_at:{seconds:1742866627 nanos:534940823}" Mar 25 01:37:08.104983 systemd-networkd[1573]: cali62b5143ad66: Gained IPv6LL Mar 25 01:37:08.106703 containerd[1737]: time="2025-03-25T01:37:08.106518990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77fc9c9b48-fjbw6,Uid:2b4f7b36-48f2-4d59-acac-c73cfa54d415,Namespace:calico-system,Attempt:0,}" Mar 25 01:37:08.108059 containerd[1737]: time="2025-03-25T01:37:08.107909802Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-664776b495-wcxlr,Uid:ccd98bf2-36fb-48d9-8a98-cf50d6e48549,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:37:08.169193 systemd-networkd[1573]: calia4e7a966641: Gained IPv6LL Mar 25 01:37:08.296750 systemd-networkd[1573]: cali6ce8915b662: Gained IPv6LL Mar 25 01:37:08.348611 systemd-networkd[1573]: cali858f58d36dd: Link UP Mar 25 01:37:08.349789 systemd-networkd[1573]: cali858f58d36dd: Gained carrier Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.195 [INFO][4843] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0 calico-kube-controllers-77fc9c9b48- calico-system 2b4f7b36-48f2-4d59-acac-c73cfa54d415 702 0 2025-03-25 01:36:40 +0000 UTC <nil> <nil> map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:77fc9c9b48 
projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284.0.0-a-0ecc1f6a74 calico-kube-controllers-77fc9c9b48-fjbw6 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali858f58d36dd [] []}} ContainerID="9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" Namespace="calico-system" Pod="calico-kube-controllers-77fc9c9b48-fjbw6" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-" Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.196 [INFO][4843] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" Namespace="calico-system" Pod="calico-kube-controllers-77fc9c9b48-fjbw6" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0" Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.290 [INFO][4868] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" HandleID="k8s-pod-network.9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0" Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.304 [INFO][4868] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" HandleID="k8s-pod-network.9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002651e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284.0.0-a-0ecc1f6a74", "pod":"calico-kube-controllers-77fc9c9b48-fjbw6", 
"timestamp":"2025-03-25 01:37:08.289866287 +0000 UTC"}, Hostname:"ci-4284.0.0-a-0ecc1f6a74", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.304 [INFO][4868] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.304 [INFO][4868] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.304 [INFO][4868] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-0ecc1f6a74' Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.307 [INFO][4868] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.313 [INFO][4868] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.321 [INFO][4868] ipam/ipam.go 489: Trying affinity for 192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.325 [INFO][4868] ipam/ipam.go 155: Attempting to load block cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.328 [INFO][4868] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.328 [INFO][4868] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.7.128/26 handle="k8s-pod-network.9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 
01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.329 [INFO][4868] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.333 [INFO][4868] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.7.128/26 handle="k8s-pod-network.9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.342 [INFO][4868] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.7.132/26] block=192.168.7.128/26 handle="k8s-pod-network.9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.342 [INFO][4868] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.7.132/26] handle="k8s-pod-network.9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.342 [INFO][4868] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:37:08.380743 containerd[1737]: 2025-03-25 01:37:08.342 [INFO][4868] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.132/26] IPv6=[] ContainerID="9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" HandleID="k8s-pod-network.9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0" Mar 25 01:37:08.381915 containerd[1737]: 2025-03-25 01:37:08.345 [INFO][4843] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" Namespace="calico-system" Pod="calico-kube-controllers-77fc9c9b48-fjbw6" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0", GenerateName:"calico-kube-controllers-77fc9c9b48-", Namespace:"calico-system", SelfLink:"", UID:"2b4f7b36-48f2-4d59-acac-c73cfa54d415", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 40, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"77fc9c9b48", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"", Pod:"calico-kube-controllers-77fc9c9b48-fjbw6", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.7.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali858f58d36dd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:37:08.381915 containerd[1737]: 2025-03-25 01:37:08.345 [INFO][4843] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.7.132/32] ContainerID="9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" Namespace="calico-system" Pod="calico-kube-controllers-77fc9c9b48-fjbw6" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0" Mar 25 01:37:08.381915 containerd[1737]: 2025-03-25 01:37:08.345 [INFO][4843] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali858f58d36dd ContainerID="9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" Namespace="calico-system" Pod="calico-kube-controllers-77fc9c9b48-fjbw6" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0" Mar 25 01:37:08.381915 containerd[1737]: 2025-03-25 01:37:08.349 [INFO][4843] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" Namespace="calico-system" Pod="calico-kube-controllers-77fc9c9b48-fjbw6" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0" Mar 25 01:37:08.381915 containerd[1737]: 2025-03-25 01:37:08.350 [INFO][4843] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" Namespace="calico-system" Pod="calico-kube-controllers-77fc9c9b48-fjbw6" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0", GenerateName:"calico-kube-controllers-77fc9c9b48-", Namespace:"calico-system", SelfLink:"", UID:"2b4f7b36-48f2-4d59-acac-c73cfa54d415", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 40, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"77fc9c9b48", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b", Pod:"calico-kube-controllers-77fc9c9b48-fjbw6", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.7.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali858f58d36dd", MAC:"1e:79:4f:a0:20:76", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:37:08.381915 containerd[1737]: 2025-03-25 01:37:08.377 [INFO][4843] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" Namespace="calico-system" Pod="calico-kube-controllers-77fc9c9b48-fjbw6" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--kube--controllers--77fc9c9b48--fjbw6-eth0" Mar 25 
01:37:08.442823 systemd-networkd[1573]: cali8f1e6892591: Link UP Mar 25 01:37:08.443624 systemd-networkd[1573]: cali8f1e6892591: Gained carrier Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.228 [INFO][4853] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0 calico-apiserver-664776b495- calico-apiserver ccd98bf2-36fb-48d9-8a98-cf50d6e48549 708 0 2025-03-25 01:36:40 +0000 UTC <nil> <nil> map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:664776b495 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284.0.0-a-0ecc1f6a74 calico-apiserver-664776b495-wcxlr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8f1e6892591 [] []}} ContainerID="2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-wcxlr" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-" Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.228 [INFO][4853] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-wcxlr" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0" Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.310 [INFO][4874] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" HandleID="k8s-pod-network.2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0" Mar 25 
01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.325 [INFO][4874] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" HandleID="k8s-pod-network.2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000135700), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284.0.0-a-0ecc1f6a74", "pod":"calico-apiserver-664776b495-wcxlr", "timestamp":"2025-03-25 01:37:08.310474356 +0000 UTC"}, Hostname:"ci-4284.0.0-a-0ecc1f6a74", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.325 [INFO][4874] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.343 [INFO][4874] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.343 [INFO][4874] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-0ecc1f6a74' Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.407 [INFO][4874] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.412 [INFO][4874] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.418 [INFO][4874] ipam/ipam.go 489: Trying affinity for 192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.420 [INFO][4874] ipam/ipam.go 155: Attempting to load block cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.422 [INFO][4874] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.422 [INFO][4874] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.7.128/26 handle="k8s-pod-network.2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.423 [INFO][4874] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8 Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.427 [INFO][4874] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.7.128/26 handle="k8s-pod-network.2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.435 [INFO][4874] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.7.133/26] block=192.168.7.128/26 handle="k8s-pod-network.2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.435 [INFO][4874] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.7.133/26] handle="k8s-pod-network.2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" host="ci-4284.0.0-a-0ecc1f6a74" Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.435 [INFO][4874] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:37:08.460107 containerd[1737]: 2025-03-25 01:37:08.435 [INFO][4874] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.133/26] IPv6=[] ContainerID="2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" HandleID="k8s-pod-network.2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0" Mar 25 01:37:08.462115 containerd[1737]: 2025-03-25 01:37:08.438 [INFO][4853] cni-plugin/k8s.go 386: Populated endpoint ContainerID="2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-wcxlr" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0", GenerateName:"calico-apiserver-664776b495-", Namespace:"calico-apiserver", SelfLink:"", UID:"ccd98bf2-36fb-48d9-8a98-cf50d6e48549", ResourceVersion:"708", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 40, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"664776b495", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"", Pod:"calico-apiserver-664776b495-wcxlr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8f1e6892591", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:37:08.462115 containerd[1737]: 2025-03-25 01:37:08.438 [INFO][4853] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.7.133/32] ContainerID="2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-wcxlr" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0" Mar 25 01:37:08.462115 containerd[1737]: 2025-03-25 01:37:08.438 [INFO][4853] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8f1e6892591 ContainerID="2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-wcxlr" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0" Mar 25 01:37:08.462115 containerd[1737]: 2025-03-25 01:37:08.444 [INFO][4853] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-wcxlr" 
WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0" Mar 25 01:37:08.462115 containerd[1737]: 2025-03-25 01:37:08.444 [INFO][4853] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-wcxlr" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0", GenerateName:"calico-apiserver-664776b495-", Namespace:"calico-apiserver", SelfLink:"", UID:"ccd98bf2-36fb-48d9-8a98-cf50d6e48549", ResourceVersion:"708", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 40, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"664776b495", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8", Pod:"calico-apiserver-664776b495-wcxlr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.7.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8f1e6892591", MAC:"b6:65:f0:be:2a:e9", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:37:08.462115 containerd[1737]: 2025-03-25 01:37:08.457 [INFO][4853] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" Namespace="calico-apiserver" Pod="calico-apiserver-664776b495-wcxlr" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-calico--apiserver--664776b495--wcxlr-eth0" Mar 25 01:37:08.791507 containerd[1737]: time="2025-03-25T01:37:08.791341482Z" level=info msg="connecting to shim 9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b" address="unix:///run/containerd/s/79d9f05b04979fbceefdb9ad4b19c1bc4ca5d6ccd299c55a482b2653cdb5417d" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:37:08.830241 containerd[1737]: time="2025-03-25T01:37:08.830194599Z" level=info msg="connecting to shim 2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8" address="unix:///run/containerd/s/34a20c2be0863fe7ea52f346e167771ca49772bc8b6aed6d3eb714d1f827de5d" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:37:08.854157 systemd[1]: Started cri-containerd-9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b.scope - libcontainer container 9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b. Mar 25 01:37:08.888865 systemd[1]: Started cri-containerd-2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8.scope - libcontainer container 2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8. 
Mar 25 01:37:08.984257 containerd[1737]: time="2025-03-25T01:37:08.984214457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77fc9c9b48-fjbw6,Uid:2b4f7b36-48f2-4d59-acac-c73cfa54d415,Namespace:calico-system,Attempt:0,} returns sandbox id \"9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b\""
Mar 25 01:37:09.001149 containerd[1737]: time="2025-03-25T01:37:09.000893993Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-664776b495-wcxlr,Uid:ccd98bf2-36fb-48d9-8a98-cf50d6e48549,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8\""
Mar 25 01:37:09.105232 containerd[1737]: time="2025-03-25T01:37:09.104910842Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dhkch,Uid:2f283303-5f02-4f4c-9d02-adcf7a86bf1f,Namespace:kube-system,Attempt:0,}"
Mar 25 01:37:09.292381 systemd-networkd[1573]: cali4d3e4600d54: Link UP
Mar 25 01:37:09.293555 systemd-networkd[1573]: cali4d3e4600d54: Gained carrier
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.176 [INFO][5005] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0 coredns-668d6bf9bc- kube-system 2f283303-5f02-4f4c-9d02-adcf7a86bf1f 706 0 2025-03-25 01:36:33 +0000 UTC <nil> <nil> map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284.0.0-a-0ecc1f6a74 coredns-668d6bf9bc-dhkch eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4d3e4600d54 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" Namespace="kube-system" Pod="coredns-668d6bf9bc-dhkch" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.176 [INFO][5005] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" Namespace="kube-system" Pod="coredns-668d6bf9bc-dhkch" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.220 [INFO][5019] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" HandleID="k8s-pod-network.f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.234 [INFO][5019] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" HandleID="k8s-pod-network.f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031afa0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284.0.0-a-0ecc1f6a74", "pod":"coredns-668d6bf9bc-dhkch", "timestamp":"2025-03-25 01:37:09.220425685 +0000 UTC"}, Hostname:"ci-4284.0.0-a-0ecc1f6a74", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.234 [INFO][5019] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.234 [INFO][5019] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.234 [INFO][5019] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284.0.0-a-0ecc1f6a74'
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.237 [INFO][5019] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" host="ci-4284.0.0-a-0ecc1f6a74"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.241 [INFO][5019] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284.0.0-a-0ecc1f6a74"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.248 [INFO][5019] ipam/ipam.go 489: Trying affinity for 192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.251 [INFO][5019] ipam/ipam.go 155: Attempting to load block cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.255 [INFO][5019] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.7.128/26 host="ci-4284.0.0-a-0ecc1f6a74"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.255 [INFO][5019] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.7.128/26 handle="k8s-pod-network.f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" host="ci-4284.0.0-a-0ecc1f6a74"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.257 [INFO][5019] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.271 [INFO][5019] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.7.128/26 handle="k8s-pod-network.f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" host="ci-4284.0.0-a-0ecc1f6a74"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.284 [INFO][5019] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.7.134/26] block=192.168.7.128/26 handle="k8s-pod-network.f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" host="ci-4284.0.0-a-0ecc1f6a74"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.285 [INFO][5019] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.7.134/26] handle="k8s-pod-network.f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" host="ci-4284.0.0-a-0ecc1f6a74"
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.285 [INFO][5019] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 25 01:37:09.321507 containerd[1737]: 2025-03-25 01:37:09.285 [INFO][5019] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.7.134/26] IPv6=[] ContainerID="f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" HandleID="k8s-pod-network.f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" Workload="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0"
Mar 25 01:37:09.322439 containerd[1737]: 2025-03-25 01:37:09.288 [INFO][5005] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" Namespace="kube-system" Pod="coredns-668d6bf9bc-dhkch" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"2f283303-5f02-4f4c-9d02-adcf7a86bf1f", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 33, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"", Pod:"coredns-668d6bf9bc-dhkch", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4d3e4600d54", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 25 01:37:09.322439 containerd[1737]: 2025-03-25 01:37:09.288 [INFO][5005] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.7.134/32] ContainerID="f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" Namespace="kube-system" Pod="coredns-668d6bf9bc-dhkch" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0"
Mar 25 01:37:09.322439 containerd[1737]: 2025-03-25 01:37:09.288 [INFO][5005] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4d3e4600d54 ContainerID="f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" Namespace="kube-system" Pod="coredns-668d6bf9bc-dhkch" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0"
Mar 25 01:37:09.322439 containerd[1737]: 2025-03-25 01:37:09.294 [INFO][5005] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" Namespace="kube-system" Pod="coredns-668d6bf9bc-dhkch" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0"
Mar 25 01:37:09.322439 containerd[1737]: 2025-03-25 01:37:09.294 [INFO][5005] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" Namespace="kube-system" Pod="coredns-668d6bf9bc-dhkch" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"2f283303-5f02-4f4c-9d02-adcf7a86bf1f", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 36, 33, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284.0.0-a-0ecc1f6a74", ContainerID:"f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767", Pod:"coredns-668d6bf9bc-dhkch", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.7.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4d3e4600d54", MAC:"16:28:62:f6:80:4f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 25 01:37:09.322439 containerd[1737]: 2025-03-25 01:37:09.314 [INFO][5005] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" Namespace="kube-system" Pod="coredns-668d6bf9bc-dhkch" WorkloadEndpoint="ci--4284.0.0--a--0ecc1f6a74-k8s-coredns--668d6bf9bc--dhkch-eth0"
Mar 25 01:37:09.391085 containerd[1737]: time="2025-03-25T01:37:09.390421974Z" level=info msg="connecting to shim f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767" address="unix:///run/containerd/s/d37bb22d6b0977f241073de87010ef2f24b8eb00edc0b995f76f3c877cf48683" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:37:09.428111 systemd[1]: Started cri-containerd-f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767.scope - libcontainer container f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767.
Mar 25 01:37:09.576709 systemd-networkd[1573]: cali8f1e6892591: Gained IPv6LL
Mar 25 01:37:09.606460 containerd[1737]: time="2025-03-25T01:37:09.606409437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dhkch,Uid:2f283303-5f02-4f4c-9d02-adcf7a86bf1f,Namespace:kube-system,Attempt:0,} returns sandbox id \"f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767\""
Mar 25 01:37:09.614265 containerd[1737]: time="2025-03-25T01:37:09.614219601Z" level=info msg="CreateContainer within sandbox \"f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Mar 25 01:37:09.654582 containerd[1737]: time="2025-03-25T01:37:09.652792016Z" level=info msg="Container d58a930e7521fcc6c446f4df79d03f170e782c261563d1b289dd621c5dfb32dd: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:37:09.661370 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount674107038.mount: Deactivated successfully.
Mar 25 01:37:09.678563 containerd[1737]: time="2025-03-25T01:37:09.678305724Z" level=info msg="CreateContainer within sandbox \"f4d2d37f71e1fe93778b26f6cbacdd1b9d24644b190ed3e2cbad7403a4db9767\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d58a930e7521fcc6c446f4df79d03f170e782c261563d1b289dd621c5dfb32dd\""
Mar 25 01:37:09.680506 containerd[1737]: time="2025-03-25T01:37:09.679301432Z" level=info msg="StartContainer for \"d58a930e7521fcc6c446f4df79d03f170e782c261563d1b289dd621c5dfb32dd\""
Mar 25 01:37:09.681116 containerd[1737]: time="2025-03-25T01:37:09.680505542Z" level=info msg="connecting to shim d58a930e7521fcc6c446f4df79d03f170e782c261563d1b289dd621c5dfb32dd" address="unix:///run/containerd/s/d37bb22d6b0977f241073de87010ef2f24b8eb00edc0b995f76f3c877cf48683" protocol=ttrpc version=3
Mar 25 01:37:09.708906 systemd[1]: Started cri-containerd-d58a930e7521fcc6c446f4df79d03f170e782c261563d1b289dd621c5dfb32dd.scope - libcontainer container d58a930e7521fcc6c446f4df79d03f170e782c261563d1b289dd621c5dfb32dd.
Mar 25 01:37:09.769335 containerd[1737]: time="2025-03-25T01:37:09.769288767Z" level=info msg="StartContainer for \"d58a930e7521fcc6c446f4df79d03f170e782c261563d1b289dd621c5dfb32dd\" returns successfully"
Mar 25 01:37:10.049350 containerd[1737]: time="2025-03-25T01:37:10.049296753Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:10.051353 containerd[1737]: time="2025-03-25T01:37:10.051243369Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204"
Mar 25 01:37:10.055107 containerd[1737]: time="2025-03-25T01:37:10.055044400Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:10.058551 containerd[1737]: time="2025-03-25T01:37:10.058473728Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:10.059506 containerd[1737]: time="2025-03-25T01:37:10.059057433Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 3.602845428s"
Mar 25 01:37:10.059506 containerd[1737]: time="2025-03-25T01:37:10.059094033Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 25 01:37:10.060750 containerd[1737]: time="2025-03-25T01:37:10.060730947Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\""
Mar 25 01:37:10.062045 containerd[1737]: time="2025-03-25T01:37:10.062017857Z" level=info msg="CreateContainer within sandbox \"52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 25 01:37:10.087688 containerd[1737]: time="2025-03-25T01:37:10.087635366Z" level=info msg="Container 8913b7e2934196a884237bdac40b126d777995af293cc3c5c512ea3edde749fe: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:37:10.088834 systemd-networkd[1573]: cali858f58d36dd: Gained IPv6LL
Mar 25 01:37:10.107221 containerd[1737]: time="2025-03-25T01:37:10.107185826Z" level=info msg="CreateContainer within sandbox \"52800d085280ca8d16ff50ff1e0e16d90ebe204c092008fec6d8c78d03cd4a92\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8913b7e2934196a884237bdac40b126d777995af293cc3c5c512ea3edde749fe\""
Mar 25 01:37:10.107709 containerd[1737]: time="2025-03-25T01:37:10.107580329Z" level=info msg="StartContainer for \"8913b7e2934196a884237bdac40b126d777995af293cc3c5c512ea3edde749fe\""
Mar 25 01:37:10.109489 containerd[1737]: time="2025-03-25T01:37:10.109265543Z" level=info msg="connecting to shim 8913b7e2934196a884237bdac40b126d777995af293cc3c5c512ea3edde749fe" address="unix:///run/containerd/s/e3e78dcb778d069ff5984e432a9fe98ca085a15f15fffdf7d2f726e8157518ed" protocol=ttrpc version=3
Mar 25 01:37:10.136642 systemd[1]: Started cri-containerd-8913b7e2934196a884237bdac40b126d777995af293cc3c5c512ea3edde749fe.scope - libcontainer container 8913b7e2934196a884237bdac40b126d777995af293cc3c5c512ea3edde749fe.
Mar 25 01:37:10.190934 containerd[1737]: time="2025-03-25T01:37:10.190814409Z" level=info msg="StartContainer for \"8913b7e2934196a884237bdac40b126d777995af293cc3c5c512ea3edde749fe\" returns successfully"
Mar 25 01:37:10.349581 kubelet[3269]: I0325 01:37:10.347889 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-664776b495-tr6kg" podStartSLOduration=26.742136935 podStartE2EDuration="30.347853191s" podCreationTimestamp="2025-03-25 01:36:40 +0000 UTC" firstStartedPulling="2025-03-25 01:37:06.454168184 +0000 UTC m=+38.452585635" lastFinishedPulling="2025-03-25 01:37:10.05988434 +0000 UTC m=+42.058301891" observedRunningTime="2025-03-25 01:37:10.326663718 +0000 UTC m=+42.325081169" watchObservedRunningTime="2025-03-25 01:37:10.347853191 +0000 UTC m=+42.346270642"
Mar 25 01:37:11.112639 systemd-networkd[1573]: cali4d3e4600d54: Gained IPv6LL
Mar 25 01:37:11.565388 kubelet[3269]: I0325 01:37:11.563178 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-dhkch" podStartSLOduration=38.563158914 podStartE2EDuration="38.563158914s" podCreationTimestamp="2025-03-25 01:36:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:37:10.350106109 +0000 UTC m=+42.348523560" watchObservedRunningTime="2025-03-25 01:37:11.563158914 +0000 UTC m=+43.561576465"
Mar 25 01:37:11.681232 containerd[1737]: time="2025-03-25T01:37:11.681176078Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:11.684912 containerd[1737]: time="2025-03-25T01:37:11.684842908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887"
Mar 25 01:37:11.689988 containerd[1737]: time="2025-03-25T01:37:11.689950349Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:11.695710 containerd[1737]: time="2025-03-25T01:37:11.695674896Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:11.697634 containerd[1737]: time="2025-03-25T01:37:11.697493811Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 1.636589663s"
Mar 25 01:37:11.697634 containerd[1737]: time="2025-03-25T01:37:11.697531211Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\""
Mar 25 01:37:11.700070 containerd[1737]: time="2025-03-25T01:37:11.699687829Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\""
Mar 25 01:37:11.701599 containerd[1737]: time="2025-03-25T01:37:11.700886639Z" level=info msg="CreateContainer within sandbox \"036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 25 01:37:11.723712 containerd[1737]: time="2025-03-25T01:37:11.723672525Z" level=info msg="Container a7ae559c6b4a053de0392290e67451eb774d792652185685d8a59cfd04309fa9: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:37:11.746299 containerd[1737]: time="2025-03-25T01:37:11.746216009Z" level=info msg="CreateContainer within sandbox \"036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"a7ae559c6b4a053de0392290e67451eb774d792652185685d8a59cfd04309fa9\""
Mar 25 01:37:11.747351 containerd[1737]: time="2025-03-25T01:37:11.746720413Z" level=info msg="StartContainer for \"a7ae559c6b4a053de0392290e67451eb774d792652185685d8a59cfd04309fa9\""
Mar 25 01:37:11.748769 containerd[1737]: time="2025-03-25T01:37:11.748542428Z" level=info msg="connecting to shim a7ae559c6b4a053de0392290e67451eb774d792652185685d8a59cfd04309fa9" address="unix:///run/containerd/s/a48df458173002e1b5618773fb1a342206fd980e8b4e17350dca38af8b62ff57" protocol=ttrpc version=3
Mar 25 01:37:11.774640 systemd[1]: Started cri-containerd-a7ae559c6b4a053de0392290e67451eb774d792652185685d8a59cfd04309fa9.scope - libcontainer container a7ae559c6b4a053de0392290e67451eb774d792652185685d8a59cfd04309fa9.
Mar 25 01:37:11.816777 containerd[1737]: time="2025-03-25T01:37:11.816534783Z" level=info msg="StartContainer for \"a7ae559c6b4a053de0392290e67451eb774d792652185685d8a59cfd04309fa9\" returns successfully"
Mar 25 01:37:13.965954 containerd[1737]: time="2025-03-25T01:37:13.965122221Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:13.969228 containerd[1737]: time="2025-03-25T01:37:13.969164363Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912"
Mar 25 01:37:13.973813 containerd[1737]: time="2025-03-25T01:37:13.973782111Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:13.981008 containerd[1737]: time="2025-03-25T01:37:13.980978286Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:13.983543 containerd[1737]: time="2025-03-25T01:37:13.982926206Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 2.283207077s"
Mar 25 01:37:13.983727 containerd[1737]: time="2025-03-25T01:37:13.983689114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\""
Mar 25 01:37:13.985662 containerd[1737]: time="2025-03-25T01:37:13.985626534Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 25 01:37:14.012899 containerd[1737]: time="2025-03-25T01:37:14.012772115Z" level=info msg="CreateContainer within sandbox \"9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 25 01:37:14.035862 containerd[1737]: time="2025-03-25T01:37:14.035815554Z" level=info msg="Container 6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:37:14.044517 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3683543961.mount: Deactivated successfully.
Mar 25 01:37:14.053965 containerd[1737]: time="2025-03-25T01:37:14.053924041Z" level=info msg="CreateContainer within sandbox \"9ba69c611974e951bf03da388a1f7a2fd6975ee4979d3a4cf76f06a70b8b367b\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79\""
Mar 25 01:37:14.054559 containerd[1737]: time="2025-03-25T01:37:14.054512847Z" level=info msg="StartContainer for \"6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79\""
Mar 25 01:37:14.057328 containerd[1737]: time="2025-03-25T01:37:14.057298276Z" level=info msg="connecting to shim 6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79" address="unix:///run/containerd/s/79d9f05b04979fbceefdb9ad4b19c1bc4ca5d6ccd299c55a482b2653cdb5417d" protocol=ttrpc version=3
Mar 25 01:37:14.086673 systemd[1]: Started cri-containerd-6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79.scope - libcontainer container 6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79.
Mar 25 01:37:14.138319 containerd[1737]: time="2025-03-25T01:37:14.138273715Z" level=info msg="StartContainer for \"6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79\" returns successfully"
Mar 25 01:37:14.366065 containerd[1737]: time="2025-03-25T01:37:14.365931773Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:14.371334 containerd[1737]: time="2025-03-25T01:37:14.370389820Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77"
Mar 25 01:37:14.377333 containerd[1737]: time="2025-03-25T01:37:14.377145790Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 391.334953ms"
Mar 25 01:37:14.377333 containerd[1737]: time="2025-03-25T01:37:14.377309891Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 25 01:37:14.380962 containerd[1737]: time="2025-03-25T01:37:14.380931229Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\""
Mar 25 01:37:14.383811 containerd[1737]: time="2025-03-25T01:37:14.383768658Z" level=info msg="CreateContainer within sandbox \"2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 25 01:37:14.413675 containerd[1737]: time="2025-03-25T01:37:14.413630767Z" level=info msg="Container 103c437395365f69c0a7bd8f54185d28562d850a611959fe556f5f6cff66d73f: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:37:14.414945 containerd[1737]: time="2025-03-25T01:37:14.414787479Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79\" id:\"5c67b7fb23a2f2777f3a03bee552d0d0f1e458b08beb28c091b609948615d85e\" pid:5253 exited_at:{seconds:1742866634 nanos:414334675}"
Mar 25 01:37:14.432807 kubelet[3269]: I0325 01:37:14.432514 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-77fc9c9b48-fjbw6" podStartSLOduration=29.434719117 podStartE2EDuration="34.432492163s" podCreationTimestamp="2025-03-25 01:36:40 +0000 UTC" firstStartedPulling="2025-03-25 01:37:08.987161781 +0000 UTC m=+40.985579232" lastFinishedPulling="2025-03-25 01:37:13.984934827 +0000 UTC m=+45.983352278" observedRunningTime="2025-03-25 01:37:14.349580904 +0000 UTC m=+46.347998455" watchObservedRunningTime="2025-03-25 01:37:14.432492163 +0000 UTC m=+46.430909614"
Mar 25 01:37:14.437560 containerd[1737]: time="2025-03-25T01:37:14.437516315Z" level=info msg="CreateContainer within sandbox \"2b8e0fbfe2d1fadb9b27ede259a6fac4fd407f20c7fa63e0e435274190a931e8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"103c437395365f69c0a7bd8f54185d28562d850a611959fe556f5f6cff66d73f\""
Mar 25 01:37:14.439568 containerd[1737]: time="2025-03-25T01:37:14.438686327Z" level=info msg="StartContainer for \"103c437395365f69c0a7bd8f54185d28562d850a611959fe556f5f6cff66d73f\""
Mar 25 01:37:14.440384 containerd[1737]: time="2025-03-25T01:37:14.440339744Z" level=info msg="connecting to shim 103c437395365f69c0a7bd8f54185d28562d850a611959fe556f5f6cff66d73f" address="unix:///run/containerd/s/34a20c2be0863fe7ea52f346e167771ca49772bc8b6aed6d3eb714d1f827de5d" protocol=ttrpc version=3
Mar 25 01:37:14.466636 systemd[1]: Started cri-containerd-103c437395365f69c0a7bd8f54185d28562d850a611959fe556f5f6cff66d73f.scope - libcontainer container 103c437395365f69c0a7bd8f54185d28562d850a611959fe556f5f6cff66d73f.
Mar 25 01:37:14.525513 containerd[1737]: time="2025-03-25T01:37:14.525371925Z" level=info msg="StartContainer for \"103c437395365f69c0a7bd8f54185d28562d850a611959fe556f5f6cff66d73f\" returns successfully"
Mar 25 01:37:15.350950 kubelet[3269]: I0325 01:37:15.350876 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-664776b495-wcxlr" podStartSLOduration=29.974058658 podStartE2EDuration="35.350854164s" podCreationTimestamp="2025-03-25 01:36:40 +0000 UTC" firstStartedPulling="2025-03-25 01:37:09.002582207 +0000 UTC m=+41.000999658" lastFinishedPulling="2025-03-25 01:37:14.379377713 +0000 UTC m=+46.377795164" observedRunningTime="2025-03-25 01:37:15.350354659 +0000 UTC m=+47.348772110" watchObservedRunningTime="2025-03-25 01:37:15.350854164 +0000 UTC m=+47.349271915"
Mar 25 01:37:16.228620 containerd[1737]: time="2025-03-25T01:37:16.228569435Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:16.231326 containerd[1737]: time="2025-03-25T01:37:16.231258762Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843"
Mar 25 01:37:16.235922 containerd[1737]: time="2025-03-25T01:37:16.235883310Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:16.242513 containerd[1737]: time="2025-03-25T01:37:16.241443168Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:37:16.242513 containerd[1737]: time="2025-03-25T01:37:16.242351677Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 1.861379048s"
Mar 25 01:37:16.242513 containerd[1737]: time="2025-03-25T01:37:16.242385377Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\""
Mar 25 01:37:16.247093 containerd[1737]: time="2025-03-25T01:37:16.247056726Z" level=info msg="CreateContainer within sandbox \"036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 25 01:37:16.271514 containerd[1737]: time="2025-03-25T01:37:16.271460278Z" level=info msg="Container b91832b880df97d301f9ee44e8cbeba8448b787d07b5d38511d8fe23ac53ead9: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:37:16.293422 containerd[1737]: time="2025-03-25T01:37:16.293377804Z" level=info msg="CreateContainer within sandbox \"036b72d1381d20a53be8b008eba3c414eb26ca753af33ae41ebee50ba040a411\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"b91832b880df97d301f9ee44e8cbeba8448b787d07b5d38511d8fe23ac53ead9\""
Mar 25 01:37:16.295058 containerd[1737]: time="2025-03-25T01:37:16.293854109Z" level=info msg="StartContainer for \"b91832b880df97d301f9ee44e8cbeba8448b787d07b5d38511d8fe23ac53ead9\""
Mar 25 01:37:16.295660 containerd[1737]: time="2025-03-25T01:37:16.295576527Z" level=info msg="connecting to shim b91832b880df97d301f9ee44e8cbeba8448b787d07b5d38511d8fe23ac53ead9" address="unix:///run/containerd/s/a48df458173002e1b5618773fb1a342206fd980e8b4e17350dca38af8b62ff57" protocol=ttrpc version=3
Mar 25 01:37:16.320649 systemd[1]: Started cri-containerd-b91832b880df97d301f9ee44e8cbeba8448b787d07b5d38511d8fe23ac53ead9.scope - libcontainer container b91832b880df97d301f9ee44e8cbeba8448b787d07b5d38511d8fe23ac53ead9.
Mar 25 01:37:16.395919 containerd[1737]: time="2025-03-25T01:37:16.395718962Z" level=info msg="StartContainer for \"b91832b880df97d301f9ee44e8cbeba8448b787d07b5d38511d8fe23ac53ead9\" returns successfully"
Mar 25 01:37:17.196261 kubelet[3269]: I0325 01:37:17.196223 3269 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 25 01:37:17.196261 kubelet[3269]: I0325 01:37:17.196261 3269 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 25 01:37:17.371498 kubelet[3269]: I0325 01:37:17.371420 3269 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-szpdf" podStartSLOduration=28.539924274 podStartE2EDuration="37.371397145s" podCreationTimestamp="2025-03-25 01:36:40 +0000 UTC" firstStartedPulling="2025-03-25 01:37:07.412580324 +0000 UTC m=+39.410997775" lastFinishedPulling="2025-03-25 01:37:16.244053195 +0000 UTC m=+48.242470646" observedRunningTime="2025-03-25 01:37:17.370113832 +0000 UTC m=+49.368531283" watchObservedRunningTime="2025-03-25 01:37:17.371397145 +0000 UTC m=+49.369814696"
Mar 25 01:37:37.506553 containerd[1737]: time="2025-03-25T01:37:37.506501358Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4\" id:\"b4877e599ca223506dee4d963054a18a1ae0a5ae328e2eb84f2c865e2cf70b58\" pid:5371 exited_at:{seconds:1742866657 nanos:505923852}"
Mar 25 01:37:44.370160 containerd[1737]: time="2025-03-25T01:37:44.370105311Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79\" id:\"f3e5ab859a853f069a52222143c5e9478e334da7fda0fbae03978f070d041314\" pid:5397 exited_at:{seconds:1742866664 nanos:369709807}"
Mar 25 01:37:51.393547 containerd[1737]: time="2025-03-25T01:37:51.392042144Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79\" id:\"ef68c68439b80eca5afbfed4111fc250e75458bd8fd7ff05f6c34c7dfcba135e\" pid:5426 exited_at:{seconds:1742866671 nanos:391740141}"
Mar 25 01:38:01.651402 waagent[1950]: 2025-03-25T01:38:01.651335Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 2]
Mar 25 01:38:01.660824 waagent[1950]: 2025-03-25T01:38:01.660775Z INFO ExtHandler
Mar 25 01:38:01.660944 waagent[1950]: 2025-03-25T01:38:01.660880Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 33fc2a7a-8bf3-4a58-a8a6-9ea2e23b3859 eTag: 1331722098310320389 source: Fabric]
Mar 25 01:38:01.661261 waagent[1950]: 2025-03-25T01:38:01.661213Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them.
Mar 25 01:38:01.661846 waagent[1950]: 2025-03-25T01:38:01.661795Z INFO ExtHandler Mar 25 01:38:01.661926 waagent[1950]: 2025-03-25T01:38:01.661879Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 2] Mar 25 01:38:01.731741 waagent[1950]: 2025-03-25T01:38:01.731683Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 25 01:38:01.797700 waagent[1950]: 2025-03-25T01:38:01.797632Z INFO ExtHandler Downloaded certificate {'thumbprint': 'EB2A6C7C6ABD7010BD4D81056442C85A10ABABD1', 'hasPrivateKey': True} Mar 25 01:38:01.798076 waagent[1950]: 2025-03-25T01:38:01.798032Z INFO ExtHandler Downloaded certificate {'thumbprint': '271DD035DC2A91B251141A7A337E1BBB9E3A36DF', 'hasPrivateKey': False} Mar 25 01:38:01.798521 waagent[1950]: 2025-03-25T01:38:01.798433Z INFO ExtHandler Fetch goal state completed Mar 25 01:38:01.798910 waagent[1950]: 2025-03-25T01:38:01.798862Z INFO ExtHandler ExtHandler Mar 25 01:38:01.798988 waagent[1950]: 2025-03-25T01:38:01.798942Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_2 channel: WireServer source: Fabric activity: 49799f20-ea07-411d-8b29-364fcca4de88 correlation cb2bf197-9887-4ff4-ac1c-7d47cb93b1d3 created: 2025-03-25T01:37:53.456871Z] Mar 25 01:38:01.799287 waagent[1950]: 2025-03-25T01:38:01.799244Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
Mar 25 01:38:01.800131 waagent[1950]: 2025-03-25T01:38:01.800014Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_2 1 ms] Mar 25 01:38:07.512999 containerd[1737]: time="2025-03-25T01:38:07.512941915Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4\" id:\"56fcaa1b6414d1c2fba33f68a0593d00f68b9c2a99a5464106099c1239322930\" pid:5456 exited_at:{seconds:1742866687 nanos:512578812}" Mar 25 01:38:09.740692 systemd[1]: Started sshd@7-10.200.8.14:22-10.200.16.10:57114.service - OpenSSH per-connection server daemon (10.200.16.10:57114). Mar 25 01:38:10.377975 sshd[5472]: Accepted publickey for core from 10.200.16.10 port 57114 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:10.379509 sshd-session[5472]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:10.384807 systemd-logind[1706]: New session 10 of user core. Mar 25 01:38:10.388642 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 25 01:38:10.896079 sshd[5474]: Connection closed by 10.200.16.10 port 57114 Mar 25 01:38:10.896952 sshd-session[5472]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:10.900671 systemd[1]: sshd@7-10.200.8.14:22-10.200.16.10:57114.service: Deactivated successfully. Mar 25 01:38:10.903052 systemd[1]: session-10.scope: Deactivated successfully. Mar 25 01:38:10.904910 systemd-logind[1706]: Session 10 logged out. Waiting for processes to exit. Mar 25 01:38:10.905910 systemd-logind[1706]: Removed session 10. 
Mar 25 01:38:14.392154 containerd[1737]: time="2025-03-25T01:38:14.392106586Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79\" id:\"9d98c4a4a1c558818adbd0e0a4b013e00cf0002aa4296c25bb7369d45efe2893\" pid:5499 exited_at:{seconds:1742866694 nanos:391339378}" Mar 25 01:38:16.008686 systemd[1]: Started sshd@8-10.200.8.14:22-10.200.16.10:57130.service - OpenSSH per-connection server daemon (10.200.16.10:57130). Mar 25 01:38:16.645639 sshd[5509]: Accepted publickey for core from 10.200.16.10 port 57130 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:16.647145 sshd-session[5509]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:16.651782 systemd-logind[1706]: New session 11 of user core. Mar 25 01:38:16.659668 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 25 01:38:17.151881 sshd[5511]: Connection closed by 10.200.16.10 port 57130 Mar 25 01:38:17.153814 sshd-session[5509]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:17.158047 systemd[1]: sshd@8-10.200.8.14:22-10.200.16.10:57130.service: Deactivated successfully. Mar 25 01:38:17.160380 systemd[1]: session-11.scope: Deactivated successfully. Mar 25 01:38:17.161200 systemd-logind[1706]: Session 11 logged out. Waiting for processes to exit. Mar 25 01:38:17.162372 systemd-logind[1706]: Removed session 11. Mar 25 01:38:22.268822 systemd[1]: Started sshd@9-10.200.8.14:22-10.200.16.10:33152.service - OpenSSH per-connection server daemon (10.200.16.10:33152). Mar 25 01:38:22.907386 sshd[5524]: Accepted publickey for core from 10.200.16.10 port 33152 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:22.909011 sshd-session[5524]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:22.913409 systemd-logind[1706]: New session 12 of user core. 
Mar 25 01:38:22.919655 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 25 01:38:23.411129 sshd[5526]: Connection closed by 10.200.16.10 port 33152 Mar 25 01:38:23.411996 sshd-session[5524]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:23.416437 systemd[1]: sshd@9-10.200.8.14:22-10.200.16.10:33152.service: Deactivated successfully. Mar 25 01:38:23.419221 systemd[1]: session-12.scope: Deactivated successfully. Mar 25 01:38:23.420286 systemd-logind[1706]: Session 12 logged out. Waiting for processes to exit. Mar 25 01:38:23.421307 systemd-logind[1706]: Removed session 12. Mar 25 01:38:23.528944 systemd[1]: Started sshd@10-10.200.8.14:22-10.200.16.10:33166.service - OpenSSH per-connection server daemon (10.200.16.10:33166). Mar 25 01:38:24.165256 sshd[5539]: Accepted publickey for core from 10.200.16.10 port 33166 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:24.166731 sshd-session[5539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:24.172666 systemd-logind[1706]: New session 13 of user core. Mar 25 01:38:24.177667 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 25 01:38:24.698661 sshd[5541]: Connection closed by 10.200.16.10 port 33166 Mar 25 01:38:24.699609 sshd-session[5539]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:24.702803 systemd[1]: sshd@10-10.200.8.14:22-10.200.16.10:33166.service: Deactivated successfully. Mar 25 01:38:24.705224 systemd[1]: session-13.scope: Deactivated successfully. Mar 25 01:38:24.706952 systemd-logind[1706]: Session 13 logged out. Waiting for processes to exit. Mar 25 01:38:24.708281 systemd-logind[1706]: Removed session 13. Mar 25 01:38:24.809633 systemd[1]: Started sshd@11-10.200.8.14:22-10.200.16.10:33182.service - OpenSSH per-connection server daemon (10.200.16.10:33182). 
Mar 25 01:38:25.448248 sshd[5551]: Accepted publickey for core from 10.200.16.10 port 33182 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:25.449735 sshd-session[5551]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:25.454013 systemd-logind[1706]: New session 14 of user core. Mar 25 01:38:25.459668 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 25 01:38:25.964461 sshd[5553]: Connection closed by 10.200.16.10 port 33182 Mar 25 01:38:25.965299 sshd-session[5551]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:25.970021 systemd[1]: sshd@11-10.200.8.14:22-10.200.16.10:33182.service: Deactivated successfully. Mar 25 01:38:25.972397 systemd[1]: session-14.scope: Deactivated successfully. Mar 25 01:38:25.973569 systemd-logind[1706]: Session 14 logged out. Waiting for processes to exit. Mar 25 01:38:25.974855 systemd-logind[1706]: Removed session 14. Mar 25 01:38:31.076600 systemd[1]: Started sshd@12-10.200.8.14:22-10.200.16.10:59168.service - OpenSSH per-connection server daemon (10.200.16.10:59168). Mar 25 01:38:31.719159 sshd[5579]: Accepted publickey for core from 10.200.16.10 port 59168 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:31.721277 sshd-session[5579]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:31.729126 systemd-logind[1706]: New session 15 of user core. Mar 25 01:38:31.734681 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 25 01:38:32.225758 sshd[5581]: Connection closed by 10.200.16.10 port 59168 Mar 25 01:38:32.227301 sshd-session[5579]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:32.230120 systemd[1]: sshd@12-10.200.8.14:22-10.200.16.10:59168.service: Deactivated successfully. Mar 25 01:38:32.232459 systemd[1]: session-15.scope: Deactivated successfully. Mar 25 01:38:32.234215 systemd-logind[1706]: Session 15 logged out. 
Waiting for processes to exit. Mar 25 01:38:32.235278 systemd-logind[1706]: Removed session 15. Mar 25 01:38:37.341876 systemd[1]: Started sshd@13-10.200.8.14:22-10.200.16.10:59172.service - OpenSSH per-connection server daemon (10.200.16.10:59172). Mar 25 01:38:37.508906 containerd[1737]: time="2025-03-25T01:38:37.508840630Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4\" id:\"33a74c43ee1a26405d8eed814bdf9739e7a35cb2436618df5aef7331ee90043e\" pid:5614 exited_at:{seconds:1742866717 nanos:508416025}" Mar 25 01:38:37.982025 sshd[5595]: Accepted publickey for core from 10.200.16.10 port 59172 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:37.983586 sshd-session[5595]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:37.988101 systemd-logind[1706]: New session 16 of user core. Mar 25 01:38:37.995652 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 25 01:38:38.490729 sshd[5626]: Connection closed by 10.200.16.10 port 59172 Mar 25 01:38:38.491715 sshd-session[5595]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:38.496164 systemd[1]: sshd@13-10.200.8.14:22-10.200.16.10:59172.service: Deactivated successfully. Mar 25 01:38:38.498529 systemd[1]: session-16.scope: Deactivated successfully. Mar 25 01:38:38.499657 systemd-logind[1706]: Session 16 logged out. Waiting for processes to exit. Mar 25 01:38:38.501264 systemd-logind[1706]: Removed session 16. Mar 25 01:38:43.606858 systemd[1]: Started sshd@14-10.200.8.14:22-10.200.16.10:43182.service - OpenSSH per-connection server daemon (10.200.16.10:43182). 
Mar 25 01:38:44.243195 sshd[5651]: Accepted publickey for core from 10.200.16.10 port 43182 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:44.244667 sshd-session[5651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:44.249205 systemd-logind[1706]: New session 17 of user core. Mar 25 01:38:44.252641 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 25 01:38:44.373580 containerd[1737]: time="2025-03-25T01:38:44.373271876Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79\" id:\"ebc9cb661aa2bcf2e539c0690285dfcc72f8044a785f3fa7a6884b204a5299f1\" pid:5666 exited_at:{seconds:1742866724 nanos:372555468}" Mar 25 01:38:44.785047 sshd[5653]: Connection closed by 10.200.16.10 port 43182 Mar 25 01:38:44.786117 sshd-session[5651]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:44.789347 systemd[1]: sshd@14-10.200.8.14:22-10.200.16.10:43182.service: Deactivated successfully. Mar 25 01:38:44.791838 systemd[1]: session-17.scope: Deactivated successfully. Mar 25 01:38:44.793403 systemd-logind[1706]: Session 17 logged out. Waiting for processes to exit. Mar 25 01:38:44.794794 systemd-logind[1706]: Removed session 17. Mar 25 01:38:44.900896 systemd[1]: Started sshd@15-10.200.8.14:22-10.200.16.10:43196.service - OpenSSH per-connection server daemon (10.200.16.10:43196). Mar 25 01:38:45.548317 sshd[5685]: Accepted publickey for core from 10.200.16.10 port 43196 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:45.549822 sshd-session[5685]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:45.554189 systemd-logind[1706]: New session 18 of user core. Mar 25 01:38:45.562626 systemd[1]: Started session-18.scope - Session 18 of User core. 
Mar 25 01:38:46.111609 sshd[5687]: Connection closed by 10.200.16.10 port 43196 Mar 25 01:38:46.112446 sshd-session[5685]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:46.116286 systemd[1]: sshd@15-10.200.8.14:22-10.200.16.10:43196.service: Deactivated successfully. Mar 25 01:38:46.118314 systemd[1]: session-18.scope: Deactivated successfully. Mar 25 01:38:46.119092 systemd-logind[1706]: Session 18 logged out. Waiting for processes to exit. Mar 25 01:38:46.120100 systemd-logind[1706]: Removed session 18. Mar 25 01:38:46.223872 systemd[1]: Started sshd@16-10.200.8.14:22-10.200.16.10:43208.service - OpenSSH per-connection server daemon (10.200.16.10:43208). Mar 25 01:38:46.858210 sshd[5697]: Accepted publickey for core from 10.200.16.10 port 43208 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:46.859934 sshd-session[5697]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:46.864607 systemd-logind[1706]: New session 19 of user core. Mar 25 01:38:46.872664 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 25 01:38:48.178499 sshd[5699]: Connection closed by 10.200.16.10 port 43208 Mar 25 01:38:48.179350 sshd-session[5697]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:48.183359 systemd[1]: sshd@16-10.200.8.14:22-10.200.16.10:43208.service: Deactivated successfully. Mar 25 01:38:48.185670 systemd[1]: session-19.scope: Deactivated successfully. Mar 25 01:38:48.186457 systemd-logind[1706]: Session 19 logged out. Waiting for processes to exit. Mar 25 01:38:48.187885 systemd-logind[1706]: Removed session 19. Mar 25 01:38:48.288974 systemd[1]: Started sshd@17-10.200.8.14:22-10.200.16.10:43220.service - OpenSSH per-connection server daemon (10.200.16.10:43220). 
Mar 25 01:38:48.924458 sshd[5716]: Accepted publickey for core from 10.200.16.10 port 43220 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:48.925958 sshd-session[5716]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:48.931136 systemd-logind[1706]: New session 20 of user core. Mar 25 01:38:48.935647 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 25 01:38:49.535831 sshd[5718]: Connection closed by 10.200.16.10 port 43220 Mar 25 01:38:49.536863 sshd-session[5716]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:49.541155 systemd[1]: sshd@17-10.200.8.14:22-10.200.16.10:43220.service: Deactivated successfully. Mar 25 01:38:49.543310 systemd[1]: session-20.scope: Deactivated successfully. Mar 25 01:38:49.544262 systemd-logind[1706]: Session 20 logged out. Waiting for processes to exit. Mar 25 01:38:49.545393 systemd-logind[1706]: Removed session 20. Mar 25 01:38:49.663116 systemd[1]: Started sshd@18-10.200.8.14:22-10.200.16.10:48734.service - OpenSSH per-connection server daemon (10.200.16.10:48734). Mar 25 01:38:50.304921 sshd[5728]: Accepted publickey for core from 10.200.16.10 port 48734 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:50.306343 sshd-session[5728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:50.310581 systemd-logind[1706]: New session 21 of user core. Mar 25 01:38:50.315816 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 25 01:38:50.808207 sshd[5730]: Connection closed by 10.200.16.10 port 48734 Mar 25 01:38:50.809070 sshd-session[5728]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:50.811983 systemd[1]: sshd@18-10.200.8.14:22-10.200.16.10:48734.service: Deactivated successfully. Mar 25 01:38:50.814381 systemd[1]: session-21.scope: Deactivated successfully. Mar 25 01:38:50.816197 systemd-logind[1706]: Session 21 logged out. 
Waiting for processes to exit. Mar 25 01:38:50.817330 systemd-logind[1706]: Removed session 21. Mar 25 01:38:51.393791 containerd[1737]: time="2025-03-25T01:38:51.393738101Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79\" id:\"d1ba62aeef9801685a9e3c352916216d6e88fc8ce6d7a162e55693d3f8f9de0a\" pid:5751 exited_at:{seconds:1742866731 nanos:393383697}" Mar 25 01:38:55.924392 systemd[1]: Started sshd@19-10.200.8.14:22-10.200.16.10:48744.service - OpenSSH per-connection server daemon (10.200.16.10:48744). Mar 25 01:38:56.559930 sshd[5763]: Accepted publickey for core from 10.200.16.10 port 48744 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:38:56.561403 sshd-session[5763]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:38:56.565814 systemd-logind[1706]: New session 22 of user core. Mar 25 01:38:56.571650 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 25 01:38:57.062928 sshd[5765]: Connection closed by 10.200.16.10 port 48744 Mar 25 01:38:57.063739 sshd-session[5763]: pam_unix(sshd:session): session closed for user core Mar 25 01:38:57.066769 systemd[1]: sshd@19-10.200.8.14:22-10.200.16.10:48744.service: Deactivated successfully. Mar 25 01:38:57.069268 systemd[1]: session-22.scope: Deactivated successfully. Mar 25 01:38:57.070799 systemd-logind[1706]: Session 22 logged out. Waiting for processes to exit. Mar 25 01:38:57.071748 systemd-logind[1706]: Removed session 22. Mar 25 01:39:02.179136 systemd[1]: Started sshd@20-10.200.8.14:22-10.200.16.10:48050.service - OpenSSH per-connection server daemon (10.200.16.10:48050). 
Mar 25 01:39:02.818369 sshd[5777]: Accepted publickey for core from 10.200.16.10 port 48050 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:39:02.819874 sshd-session[5777]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:39:02.824123 systemd-logind[1706]: New session 23 of user core. Mar 25 01:39:02.833891 systemd[1]: Started session-23.scope - Session 23 of User core. Mar 25 01:39:03.324623 sshd[5779]: Connection closed by 10.200.16.10 port 48050 Mar 25 01:39:03.325433 sshd-session[5777]: pam_unix(sshd:session): session closed for user core Mar 25 01:39:03.330069 systemd[1]: sshd@20-10.200.8.14:22-10.200.16.10:48050.service: Deactivated successfully. Mar 25 01:39:03.332473 systemd[1]: session-23.scope: Deactivated successfully. Mar 25 01:39:03.333402 systemd-logind[1706]: Session 23 logged out. Waiting for processes to exit. Mar 25 01:39:03.334474 systemd-logind[1706]: Removed session 23. Mar 25 01:39:07.506068 containerd[1737]: time="2025-03-25T01:39:07.506000183Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ec0f1985f9aca7ebbe181a457a3a4d945592c40450f87c48560cd363207987d4\" id:\"ff35c1a2332db4b184423b1a0dc9d3c6d1d7e77c07247ce40c286971bed9deed\" pid:5804 exited_at:{seconds:1742866747 nanos:505612479}" Mar 25 01:39:08.442747 systemd[1]: Started sshd@21-10.200.8.14:22-10.200.16.10:48066.service - OpenSSH per-connection server daemon (10.200.16.10:48066). Mar 25 01:39:09.078416 sshd[5817]: Accepted publickey for core from 10.200.16.10 port 48066 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:39:09.080029 sshd-session[5817]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:39:09.084539 systemd-logind[1706]: New session 24 of user core. Mar 25 01:39:09.086693 systemd[1]: Started session-24.scope - Session 24 of User core. 
Mar 25 01:39:09.584006 sshd[5819]: Connection closed by 10.200.16.10 port 48066 Mar 25 01:39:09.584889 sshd-session[5817]: pam_unix(sshd:session): session closed for user core Mar 25 01:39:09.588346 systemd[1]: sshd@21-10.200.8.14:22-10.200.16.10:48066.service: Deactivated successfully. Mar 25 01:39:09.590631 systemd[1]: session-24.scope: Deactivated successfully. Mar 25 01:39:09.592678 systemd-logind[1706]: Session 24 logged out. Waiting for processes to exit. Mar 25 01:39:09.593830 systemd-logind[1706]: Removed session 24. Mar 25 01:39:14.372528 containerd[1737]: time="2025-03-25T01:39:14.372462871Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6f3eb353be8d8c07ab59b38fa53ba922e5169ead3383449b2aba4c6039d16e79\" id:\"7fca6dc9ee1a2d3c350df239ae586bd2a89cc8bb4869b4a72d49bb7ceea419a5\" pid:5842 exited_at:{seconds:1742866754 nanos:372168068}" Mar 25 01:39:14.700770 systemd[1]: Started sshd@22-10.200.8.14:22-10.200.16.10:37162.service - OpenSSH per-connection server daemon (10.200.16.10:37162). Mar 25 01:39:15.352282 sshd[5852]: Accepted publickey for core from 10.200.16.10 port 37162 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:39:15.351879 sshd-session[5852]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:39:15.358162 systemd-logind[1706]: New session 25 of user core. Mar 25 01:39:15.366643 systemd[1]: Started session-25.scope - Session 25 of User core. Mar 25 01:39:15.859865 sshd[5854]: Connection closed by 10.200.16.10 port 37162 Mar 25 01:39:15.860755 sshd-session[5852]: pam_unix(sshd:session): session closed for user core Mar 25 01:39:15.864321 systemd[1]: sshd@22-10.200.8.14:22-10.200.16.10:37162.service: Deactivated successfully. Mar 25 01:39:15.866834 systemd[1]: session-25.scope: Deactivated successfully. Mar 25 01:39:15.868464 systemd-logind[1706]: Session 25 logged out. Waiting for processes to exit. Mar 25 01:39:15.869675 systemd-logind[1706]: Removed session 25. 
Mar 25 01:39:20.972808 systemd[1]: Started sshd@23-10.200.8.14:22-10.200.16.10:42272.service - OpenSSH per-connection server daemon (10.200.16.10:42272). Mar 25 01:39:21.608608 sshd[5865]: Accepted publickey for core from 10.200.16.10 port 42272 ssh2: RSA SHA256:yvM9aJCEcWMwwpyRstQ24Z65MqryworXgmyV3HoKOoA Mar 25 01:39:21.610024 sshd-session[5865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:39:21.614319 systemd-logind[1706]: New session 26 of user core. Mar 25 01:39:21.621638 systemd[1]: Started session-26.scope - Session 26 of User core. Mar 25 01:39:22.111725 sshd[5867]: Connection closed by 10.200.16.10 port 42272 Mar 25 01:39:22.112426 sshd-session[5865]: pam_unix(sshd:session): session closed for user core Mar 25 01:39:22.115347 systemd[1]: sshd@23-10.200.8.14:22-10.200.16.10:42272.service: Deactivated successfully. Mar 25 01:39:22.117710 systemd[1]: session-26.scope: Deactivated successfully. Mar 25 01:39:22.119343 systemd-logind[1706]: Session 26 logged out. Waiting for processes to exit. Mar 25 01:39:22.120592 systemd-logind[1706]: Removed session 26.