Mar 25 01:31:50.012248 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Mar 24 23:38:35 -00 2025 Mar 25 01:31:50.012298 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9 Mar 25 01:31:50.012313 kernel: BIOS-provided physical RAM map: Mar 25 01:31:50.012323 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Mar 25 01:31:50.012337 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Mar 25 01:31:50.012347 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Mar 25 01:31:50.012358 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable Mar 25 01:31:50.012368 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved Mar 25 01:31:50.012378 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Mar 25 01:31:50.012388 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved Mar 25 01:31:50.012398 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Mar 25 01:31:50.012407 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Mar 25 01:31:50.012422 kernel: NX (Execute Disable) protection: active Mar 25 01:31:50.012432 kernel: APIC: Static calls initialized Mar 25 01:31:50.012444 kernel: SMBIOS 2.8 present. Mar 25 01:31:50.012455 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.16.0-3.module_el8.7.0+3346+68867adb 04/01/2014 Mar 25 01:31:50.012466 kernel: Hypervisor detected: KVM Mar 25 01:31:50.012476 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Mar 25 01:31:50.012492 kernel: kvm-clock: using sched offset of 4639697347 cycles Mar 25 01:31:50.012503 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Mar 25 01:31:50.012514 kernel: tsc: Detected 2799.998 MHz processor Mar 25 01:31:50.012526 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Mar 25 01:31:50.012537 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Mar 25 01:31:50.012547 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000 Mar 25 01:31:50.012558 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Mar 25 01:31:50.012568 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Mar 25 01:31:50.012579 kernel: Using GB pages for direct mapping Mar 25 01:31:50.012595 kernel: ACPI: Early table checksum verification disabled Mar 25 01:31:50.012605 kernel: ACPI: RSDP 0x00000000000F59E0 000014 (v00 BOCHS ) Mar 25 01:31:50.012616 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:31:50.012627 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:31:50.012638 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:31:50.012648 kernel: ACPI: FACS 0x000000007FFDFD40 000040 Mar 25 01:31:50.012659 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:31:50.012670 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 
00000001 BXPC 00000001) Mar 25 01:31:50.012680 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:31:50.012696 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:31:50.012707 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480] Mar 25 01:31:50.012718 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c] Mar 25 01:31:50.012728 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f] Mar 25 01:31:50.012745 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570] Mar 25 01:31:50.012756 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740] Mar 25 01:31:50.012772 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c] Mar 25 01:31:50.012783 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4] Mar 25 01:31:50.012795 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 Mar 25 01:31:50.012806 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0 Mar 25 01:31:50.012817 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0 Mar 25 01:31:50.012828 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0 Mar 25 01:31:50.012839 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0 Mar 25 01:31:50.012849 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0 Mar 25 01:31:50.012865 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0 Mar 25 01:31:50.012876 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0 Mar 25 01:31:50.012887 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0 Mar 25 01:31:50.014030 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0 Mar 25 01:31:50.014043 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0 Mar 25 01:31:50.014054 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0 Mar 25 01:31:50.014065 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0 Mar 25 01:31:50.014076 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0 Mar 25 01:31:50.014087 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0 Mar 25 01:31:50.014098 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0 Mar 25 01:31:50.014116 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Mar 25 01:31:50.014128 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Mar 25 01:31:50.014139 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug Mar 25 01:31:50.014150 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff] Mar 25 01:31:50.014162 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff] Mar 25 01:31:50.014173 kernel: Zone ranges: Mar 25 01:31:50.014184 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Mar 25 01:31:50.014195 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff] Mar 25 01:31:50.014206 kernel: Normal empty Mar 25 01:31:50.014222 kernel: Movable zone start for each node Mar 25 01:31:50.014234 kernel: Early memory node ranges Mar 25 01:31:50.014245 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Mar 25 01:31:50.014256 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff] Mar 25 01:31:50.014267 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff] Mar 25 01:31:50.014278 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Mar 25 01:31:50.014289 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Mar 25 01:31:50.014300 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges Mar 25 01:31:50.014311 kernel: ACPI: PM-Timer IO Port: 0x608 Mar 25 01:31:50.014327 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Mar 25 01:31:50.014339 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, 
GSI 0-23 Mar 25 01:31:50.014350 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Mar 25 01:31:50.014361 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Mar 25 01:31:50.014372 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Mar 25 01:31:50.014383 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Mar 25 01:31:50.014394 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Mar 25 01:31:50.014405 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Mar 25 01:31:50.014416 kernel: TSC deadline timer available Mar 25 01:31:50.014431 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs Mar 25 01:31:50.014443 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Mar 25 01:31:50.014454 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices Mar 25 01:31:50.014465 kernel: Booting paravirtualized kernel on KVM Mar 25 01:31:50.014476 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Mar 25 01:31:50.014487 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Mar 25 01:31:50.014498 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Mar 25 01:31:50.014510 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Mar 25 01:31:50.014520 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Mar 25 01:31:50.014536 kernel: kvm-guest: PV spinlocks enabled Mar 25 01:31:50.014548 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Mar 25 01:31:50.014561 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9 Mar 25 01:31:50.014573 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 25 01:31:50.014584 kernel: random: crng init done Mar 25 01:31:50.014595 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 25 01:31:50.014606 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Mar 25 01:31:50.014617 kernel: Fallback order for Node 0: 0 Mar 25 01:31:50.014633 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804 Mar 25 01:31:50.014645 kernel: Policy zone: DMA32 Mar 25 01:31:50.014656 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 25 01:31:50.014667 kernel: software IO TLB: area num 16. Mar 25 01:31:50.014678 kernel: Memory: 1897436K/2096616K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43592K init, 1472K bss, 198920K reserved, 0K cma-reserved) Mar 25 01:31:50.014689 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Mar 25 01:31:50.014700 kernel: Kernel/User page tables isolation: enabled Mar 25 01:31:50.014711 kernel: ftrace: allocating 37985 entries in 149 pages Mar 25 01:31:50.014722 kernel: ftrace: allocated 149 pages with 4 groups Mar 25 01:31:50.014738 kernel: Dynamic Preempt: voluntary Mar 25 01:31:50.014750 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 25 01:31:50.014762 kernel: rcu: RCU event tracing is enabled. 
Mar 25 01:31:50.014773 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Mar 25 01:31:50.014785 kernel: Trampoline variant of Tasks RCU enabled. Mar 25 01:31:50.014808 kernel: Rude variant of Tasks RCU enabled. Mar 25 01:31:50.014824 kernel: Tracing variant of Tasks RCU enabled. Mar 25 01:31:50.014836 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Mar 25 01:31:50.014848 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Mar 25 01:31:50.014859 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16 Mar 25 01:31:50.014871 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 25 01:31:50.014882 kernel: Console: colour VGA+ 80x25 Mar 25 01:31:50.017941 kernel: printk: console [tty0] enabled Mar 25 01:31:50.017961 kernel: printk: console [ttyS0] enabled Mar 25 01:31:50.017974 kernel: ACPI: Core revision 20230628 Mar 25 01:31:50.017986 kernel: APIC: Switch to symmetric I/O mode setup Mar 25 01:31:50.018048 kernel: x2apic enabled Mar 25 01:31:50.018086 kernel: APIC: Switched APIC routing to: physical x2apic Mar 25 01:31:50.018099 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Mar 25 01:31:50.018111 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998) Mar 25 01:31:50.018123 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Mar 25 01:31:50.018135 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Mar 25 01:31:50.018147 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Mar 25 01:31:50.018159 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Mar 25 01:31:50.018170 kernel: Spectre V2 : Mitigation: Retpolines Mar 25 01:31:50.018182 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Mar 25 01:31:50.018193 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT Mar 25 01:31:50.018210 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls Mar 25 01:31:50.018222 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Mar 25 01:31:50.018234 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Mar 25 01:31:50.018245 kernel: MDS: Mitigation: Clear CPU buffers Mar 25 01:31:50.018257 kernel: MMIO Stale Data: Unknown: No mitigations Mar 25 01:31:50.018268 kernel: SRBDS: Unknown: Dependent on hypervisor status Mar 25 01:31:50.018280 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Mar 25 01:31:50.018292 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Mar 25 01:31:50.018303 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Mar 25 01:31:50.018315 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Mar 25 01:31:50.018326 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. Mar 25 01:31:50.018343 kernel: Freeing SMP alternatives memory: 32K Mar 25 01:31:50.018355 kernel: pid_max: default: 32768 minimum: 301 Mar 25 01:31:50.018367 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 25 01:31:50.018379 kernel: landlock: Up and running. Mar 25 01:31:50.018390 kernel: SELinux: Initializing. 
Mar 25 01:31:50.018402 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Mar 25 01:31:50.018413 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Mar 25 01:31:50.018425 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9) Mar 25 01:31:50.018437 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Mar 25 01:31:50.018449 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Mar 25 01:31:50.018461 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Mar 25 01:31:50.018478 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only. Mar 25 01:31:50.018490 kernel: signal: max sigframe size: 1776 Mar 25 01:31:50.018502 kernel: rcu: Hierarchical SRCU implementation. Mar 25 01:31:50.018514 kernel: rcu: Max phase no-delay instances is 400. Mar 25 01:31:50.018526 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Mar 25 01:31:50.018538 kernel: smp: Bringing up secondary CPUs ... Mar 25 01:31:50.018549 kernel: smpboot: x86: Booting SMP configuration: Mar 25 01:31:50.018561 kernel: .... node #0, CPUs: #1 Mar 25 01:31:50.018573 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Mar 25 01:31:50.018590 kernel: smp: Brought up 1 node, 2 CPUs Mar 25 01:31:50.018602 kernel: smpboot: Max logical packages: 16 Mar 25 01:31:50.018614 kernel: smpboot: Total of 2 processors activated (11199.99 BogoMIPS) Mar 25 01:31:50.018625 kernel: devtmpfs: initialized Mar 25 01:31:50.018637 kernel: x86/mm: Memory block size: 128MB Mar 25 01:31:50.018649 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 25 01:31:50.018661 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Mar 25 01:31:50.018672 kernel: pinctrl core: initialized pinctrl subsystem Mar 25 01:31:50.018684 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 25 01:31:50.018701 kernel: audit: initializing netlink subsys (disabled) Mar 25 01:31:50.018713 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 25 01:31:50.018725 kernel: thermal_sys: Registered thermal governor 'user_space' Mar 25 01:31:50.018737 kernel: audit: type=2000 audit(1742866309.085:1): state=initialized audit_enabled=0 res=1 Mar 25 01:31:50.018748 kernel: cpuidle: using governor menu Mar 25 01:31:50.018760 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 25 01:31:50.018772 kernel: dca service started, version 1.12.1 Mar 25 01:31:50.018784 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000) Mar 25 01:31:50.018796 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry Mar 25 01:31:50.018813 kernel: PCI: Using configuration type 1 for base access Mar 25 01:31:50.018825 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Mar 25 01:31:50.018837 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 25 01:31:50.018849 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Mar 25 01:31:50.018860 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 25 01:31:50.018872 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Mar 25 01:31:50.018884 kernel: ACPI: Added _OSI(Module Device) Mar 25 01:31:50.018910 kernel: ACPI: Added _OSI(Processor Device) Mar 25 01:31:50.020940 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 25 01:31:50.020963 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 25 01:31:50.020976 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 25 01:31:50.021000 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Mar 25 01:31:50.021014 kernel: ACPI: Interpreter enabled Mar 25 01:31:50.021026 kernel: ACPI: PM: (supports S0 S5) Mar 25 01:31:50.021038 kernel: ACPI: Using IOAPIC for interrupt routing Mar 25 01:31:50.021050 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Mar 25 01:31:50.021062 kernel: PCI: Using E820 reservations for host bridge windows Mar 25 01:31:50.021074 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Mar 25 01:31:50.021092 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Mar 25 01:31:50.021370 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 25 01:31:50.021551 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Mar 25 01:31:50.021723 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Mar 25 01:31:50.021741 kernel: PCI host bridge to bus 0000:00 Mar 25 01:31:50.024800 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Mar 25 01:31:50.025037 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Mar 25 01:31:50.025195 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Mar 25 01:31:50.025347 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window] Mar 25 01:31:50.025499 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Mar 25 01:31:50.025651 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window] Mar 25 01:31:50.025803 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Mar 25 01:31:50.026041 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 Mar 25 01:31:50.026234 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000 Mar 25 01:31:50.026405 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref] Mar 25 01:31:50.026573 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff] Mar 25 01:31:50.026740 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref] Mar 25 01:31:50.028980 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Mar 25 01:31:50.029198 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 Mar 25 01:31:50.029374 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff] Mar 25 01:31:50.029562 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 Mar 25 01:31:50.029732 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff] Mar 25 01:31:50.031868 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 Mar 25 01:31:50.032094 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff] Mar 25 01:31:50.032278 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 Mar 25 
01:31:50.032450 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff] Mar 25 01:31:50.032638 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 Mar 25 01:31:50.032817 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff] Mar 25 01:31:50.033090 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 Mar 25 01:31:50.033264 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff] Mar 25 01:31:50.033463 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 Mar 25 01:31:50.033642 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff] Mar 25 01:31:50.033827 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 Mar 25 01:31:50.034032 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff] Mar 25 01:31:50.034213 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 Mar 25 01:31:50.034388 kernel: pci 0000:00:03.0: reg 0x10: [io 0xd0c0-0xd0df] Mar 25 01:31:50.034567 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff] Mar 25 01:31:50.034745 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref] Mar 25 01:31:50.040129 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref] Mar 25 01:31:50.040327 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 Mar 25 01:31:50.040506 kernel: pci 0000:00:04.0: reg 0x10: [io 0xd000-0xd07f] Mar 25 01:31:50.040680 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff] Mar 25 01:31:50.040852 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref] Mar 25 01:31:50.041092 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 Mar 25 01:31:50.041261 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Mar 25 01:31:50.041446 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 Mar 25 01:31:50.041614 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xd0e0-0xd0ff] Mar 25 01:31:50.041790 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff] Mar 25 01:31:50.045333 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 Mar 25 01:31:50.045545 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f] Mar 25 01:31:50.045759 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 Mar 25 01:31:50.045969 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit] Mar 25 01:31:50.046180 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Mar 25 01:31:50.046349 kernel: pci 0000:00:02.0: bridge window [io 0xc000-0xcfff] Mar 25 01:31:50.046514 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Mar 25 01:31:50.046679 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Mar 25 01:31:50.046876 kernel: pci_bus 0000:02: extended config space not accessible Mar 25 01:31:50.050878 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000 Mar 25 01:31:50.051142 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f] Mar 25 01:31:50.051320 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Mar 25 01:31:50.051492 kernel: pci 0000:01:00.0: bridge window [io 0xc000-0xcfff] Mar 25 01:31:50.051684 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Mar 25 01:31:50.051863 kernel: pci 0000:01:00.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Mar 25 01:31:50.052089 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330 Mar 25 01:31:50.052269 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit] Mar 25 01:31:50.052439 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Mar 25 01:31:50.052616 
kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff] Mar 25 01:31:50.052777 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Mar 25 01:31:50.054073 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00 Mar 25 01:31:50.054256 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref] Mar 25 01:31:50.054426 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Mar 25 01:31:50.054596 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Mar 25 01:31:50.054782 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Mar 25 01:31:50.056074 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Mar 25 01:31:50.056245 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Mar 25 01:31:50.056434 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Mar 25 01:31:50.056602 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Mar 25 01:31:50.056777 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Mar 25 01:31:50.058177 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Mar 25 01:31:50.058363 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Mar 25 01:31:50.058549 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Mar 25 01:31:50.058710 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Mar 25 01:31:50.058893 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Mar 25 01:31:50.060124 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Mar 25 01:31:50.060305 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Mar 25 01:31:50.060482 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Mar 25 01:31:50.060649 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Mar 25 01:31:50.060834 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Mar 25 01:31:50.060870 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Mar 25 01:31:50.060882 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Mar 25 01:31:50.060893 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Mar 25 01:31:50.062920 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Mar 25 01:31:50.062962 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Mar 25 01:31:50.062977 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Mar 25 01:31:50.062999 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Mar 25 01:31:50.063012 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Mar 25 01:31:50.063024 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Mar 25 01:31:50.063044 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Mar 25 01:31:50.063056 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Mar 25 01:31:50.063068 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Mar 25 01:31:50.063080 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Mar 25 01:31:50.063092 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Mar 25 01:31:50.063103 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Mar 25 01:31:50.063115 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Mar 25 01:31:50.063127 kernel: iommu: Default domain type: Translated Mar 25 01:31:50.063138 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Mar 25 01:31:50.063156 kernel: PCI: Using ACPI for IRQ routing Mar 25 01:31:50.063168 kernel: PCI: 
pci_cache_line_size set to 64 bytes Mar 25 01:31:50.063179 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Mar 25 01:31:50.063191 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff] Mar 25 01:31:50.063371 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Mar 25 01:31:50.063541 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Mar 25 01:31:50.063705 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Mar 25 01:31:50.063724 kernel: vgaarb: loaded Mar 25 01:31:50.063742 kernel: clocksource: Switched to clocksource kvm-clock Mar 25 01:31:50.063755 kernel: VFS: Disk quotas dquot_6.6.0 Mar 25 01:31:50.063767 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 25 01:31:50.063779 kernel: pnp: PnP ACPI init Mar 25 01:31:50.063984 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved Mar 25 01:31:50.064017 kernel: pnp: PnP ACPI: found 5 devices Mar 25 01:31:50.064030 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Mar 25 01:31:50.064041 kernel: NET: Registered PF_INET protocol family Mar 25 01:31:50.064060 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 25 01:31:50.064073 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Mar 25 01:31:50.064085 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 25 01:31:50.064097 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Mar 25 01:31:50.064109 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Mar 25 01:31:50.064121 kernel: TCP: Hash tables configured (established 16384 bind 16384) Mar 25 01:31:50.064132 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Mar 25 01:31:50.064144 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Mar 25 01:31:50.064156 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 25 01:31:50.064173 kernel: NET: Registered PF_XDP protocol family Mar 25 01:31:50.064338 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Mar 25 01:31:50.064535 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Mar 25 01:31:50.064711 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Mar 25 01:31:50.064888 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Mar 25 01:31:50.067145 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Mar 25 01:31:50.067329 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Mar 25 01:31:50.067501 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Mar 25 01:31:50.067669 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x1000-0x1fff] Mar 25 01:31:50.067835 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x2000-0x2fff] Mar 25 01:31:50.068052 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x3000-0x3fff] Mar 25 01:31:50.068220 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x4000-0x4fff] Mar 25 01:31:50.068386 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x5000-0x5fff] Mar 25 01:31:50.068552 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x6000-0x6fff] Mar 25 01:31:50.068775 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x7000-0x7fff] Mar 25 01:31:50.070014 kernel: pci 0000:01:00.0: PCI bridge to [bus 02] Mar 25 01:31:50.070199 
kernel: pci 0000:01:00.0: bridge window [io 0xc000-0xcfff] Mar 25 01:31:50.070406 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff] Mar 25 01:31:50.070580 kernel: pci 0000:01:00.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Mar 25 01:31:50.070751 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02] Mar 25 01:31:50.070952 kernel: pci 0000:00:02.0: bridge window [io 0xc000-0xcfff] Mar 25 01:31:50.071136 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff] Mar 25 01:31:50.071313 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Mar 25 01:31:50.071493 kernel: pci 0000:00:02.1: PCI bridge to [bus 03] Mar 25 01:31:50.071668 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x1fff] Mar 25 01:31:50.071848 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff] Mar 25 01:31:50.074114 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Mar 25 01:31:50.074299 kernel: pci 0000:00:02.2: PCI bridge to [bus 04] Mar 25 01:31:50.074490 kernel: pci 0000:00:02.2: bridge window [io 0x2000-0x2fff] Mar 25 01:31:50.074686 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff] Mar 25 01:31:50.074863 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Mar 25 01:31:50.075058 kernel: pci 0000:00:02.3: PCI bridge to [bus 05] Mar 25 01:31:50.075224 kernel: pci 0000:00:02.3: bridge window [io 0x3000-0x3fff] Mar 25 01:31:50.075403 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff] Mar 25 01:31:50.075570 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Mar 25 01:31:50.075740 kernel: pci 0000:00:02.4: PCI bridge to [bus 06] Mar 25 01:31:50.078011 kernel: pci 0000:00:02.4: bridge window [io 0x4000-0x4fff] Mar 25 01:31:50.078216 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff] Mar 25 01:31:50.078388 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Mar 25 01:31:50.078568 kernel: pci 0000:00:02.5: PCI bridge to [bus 07] Mar 25 01:31:50.078755 kernel: pci 0000:00:02.5: bridge window [io 0x5000-0x5fff] Mar 25 01:31:50.078929 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff] Mar 25 01:31:50.079130 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Mar 25 01:31:50.079299 kernel: pci 0000:00:02.6: PCI bridge to [bus 08] Mar 25 01:31:50.079471 kernel: pci 0000:00:02.6: bridge window [io 0x6000-0x6fff] Mar 25 01:31:50.079645 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff] Mar 25 01:31:50.079813 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Mar 25 01:31:50.084999 kernel: pci 0000:00:02.7: PCI bridge to [bus 09] Mar 25 01:31:50.085178 kernel: pci 0000:00:02.7: bridge window [io 0x7000-0x7fff] Mar 25 01:31:50.085346 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff] Mar 25 01:31:50.085512 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Mar 25 01:31:50.085670 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Mar 25 01:31:50.085831 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Mar 25 01:31:50.086010 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Mar 25 01:31:50.086163 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window] Mar 25 01:31:50.086311 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Mar 25 01:31:50.086460 kernel: pci_bus 0000:00: resource 9 [mem 
0x20c0000000-0x28bfffffff window] Mar 25 01:31:50.086654 kernel: pci_bus 0000:01: resource 0 [io 0xc000-0xcfff] Mar 25 01:31:50.086814 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff] Mar 25 01:31:50.087031 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] Mar 25 01:31:50.087214 kernel: pci_bus 0000:02: resource 0 [io 0xc000-0xcfff] Mar 25 01:31:50.087392 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff] Mar 25 01:31:50.087567 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] Mar 25 01:31:50.087748 kernel: pci_bus 0000:03: resource 0 [io 0x1000-0x1fff] Mar 25 01:31:50.087933 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff] Mar 25 01:31:50.088126 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] Mar 25 01:31:50.088330 kernel: pci_bus 0000:04: resource 0 [io 0x2000-0x2fff] Mar 25 01:31:50.088510 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff] Mar 25 01:31:50.088679 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] Mar 25 01:31:50.088859 kernel: pci_bus 0000:05: resource 0 [io 0x3000-0x3fff] Mar 25 01:31:50.093147 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff] Mar 25 01:31:50.093315 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] Mar 25 01:31:50.093486 kernel: pci_bus 0000:06: resource 0 [io 0x4000-0x4fff] Mar 25 01:31:50.093655 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff] Mar 25 01:31:50.093811 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] Mar 25 01:31:50.094049 kernel: pci_bus 0000:07: resource 0 [io 0x5000-0x5fff] Mar 25 01:31:50.094209 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff] Mar 25 01:31:50.094364 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Mar 25 01:31:50.094529 kernel: pci_bus 0000:08: resource 0 [io 0x6000-0x6fff] Mar 25 01:31:50.094694 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff] Mar 25 01:31:50.094850 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Mar 25 01:31:50.095050 kernel: pci_bus 0000:09: resource 0 [io 0x7000-0x7fff] Mar 25 01:31:50.095208 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff] Mar 25 01:31:50.095364 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Mar 25 01:31:50.095391 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Mar 25 01:31:50.095409 kernel: PCI: CLS 0 bytes, default 64 Mar 25 01:31:50.095423 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 25 01:31:50.095435 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB) Mar 25 01:31:50.095448 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Mar 25 01:31:50.095461 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Mar 25 01:31:50.095473 kernel: Initialise system trusted keyrings Mar 25 01:31:50.095486 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Mar 25 01:31:50.095511 kernel: Key type asymmetric registered Mar 25 01:31:50.095522 kernel: Asymmetric key parser 'x509' registered Mar 25 01:31:50.095539 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 25 01:31:50.095552 kernel: io scheduler mq-deadline registered Mar 25 01:31:50.095564 kernel: io scheduler kyber registered Mar 25 01:31:50.095576 kernel: io scheduler bfq registered Mar 25 
01:31:50.095761 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 25 01:31:50.095916 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 25 01:31:50.096122 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:31:50.096304 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 25 01:31:50.096467 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 25 01:31:50.096665 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:31:50.096848 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 25 01:31:50.101628 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 25 01:31:50.101808 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:31:50.102015 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 25 01:31:50.102188 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 25 01:31:50.102365 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:31:50.102542 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 25 01:31:50.102710 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 25 01:31:50.102875 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:31:50.103084 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 25 01:31:50.103252 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 25 01:31:50.103440 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:31:50.103619 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 25 01:31:50.103786 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 25 01:31:50.103984 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:31:50.104168 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 25 01:31:50.104336 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 25 01:31:50.104511 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:31:50.104532 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 25 01:31:50.104546 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 25 01:31:50.104558 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Mar 25 01:31:50.104571 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 25 01:31:50.104583 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 25 01:31:50.104596 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 25 01:31:50.104614 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 25 01:31:50.104627 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 25 01:31:50.104796 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 25 01:31:50.104816 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 25 01:31:50.110208 kernel: rtc_cmos 00:03: registered as rtc0 Mar 25 
01:31:50.110371 kernel: rtc_cmos 00:03: setting system clock to 2025-03-25T01:31:49 UTC (1742866309) Mar 25 01:31:50.110526 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Mar 25 01:31:50.110545 kernel: intel_pstate: CPU model not supported Mar 25 01:31:50.110566 kernel: NET: Registered PF_INET6 protocol family Mar 25 01:31:50.110579 kernel: Segment Routing with IPv6 Mar 25 01:31:50.110592 kernel: In-situ OAM (IOAM) with IPv6 Mar 25 01:31:50.110604 kernel: NET: Registered PF_PACKET protocol family Mar 25 01:31:50.110616 kernel: Key type dns_resolver registered Mar 25 01:31:50.110629 kernel: IPI shorthand broadcast: enabled Mar 25 01:31:50.110641 kernel: sched_clock: Marking stable (1094003188, 225648060)->(1544701650, -225050402) Mar 25 01:31:50.110659 kernel: registered taskstats version 1 Mar 25 01:31:50.110672 kernel: Loading compiled-in X.509 certificates Mar 25 01:31:50.110689 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: eff01054e94a599f8e404b9a9482f4e2220f5386' Mar 25 01:31:50.110701 kernel: Key type .fscrypt registered Mar 25 01:31:50.110713 kernel: Key type fscrypt-provisioning registered Mar 25 01:31:50.110726 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 25 01:31:50.110738 kernel: ima: Allocated hash algorithm: sha1 Mar 25 01:31:50.110750 kernel: ima: No architecture policies found Mar 25 01:31:50.110762 kernel: clk: Disabling unused clocks Mar 25 01:31:50.110775 kernel: Freeing unused kernel image (initmem) memory: 43592K Mar 25 01:31:50.110787 kernel: Write protecting the kernel read-only data: 40960k Mar 25 01:31:50.110805 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K Mar 25 01:31:50.110817 kernel: Run /init as init process Mar 25 01:31:50.110829 kernel: with arguments: Mar 25 01:31:50.110842 kernel: /init Mar 25 01:31:50.110853 kernel: with environment: Mar 25 01:31:50.110865 kernel: HOME=/ Mar 25 01:31:50.110877 kernel: TERM=linux Mar 25 01:31:50.110902 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 25 01:31:50.110926 systemd[1]: Successfully made /usr/ read-only. Mar 25 01:31:50.110963 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:31:50.110977 systemd[1]: Detected virtualization kvm. Mar 25 01:31:50.111024 systemd[1]: Detected architecture x86-64. Mar 25 01:31:50.111039 systemd[1]: Running in initrd. Mar 25 01:31:50.111052 systemd[1]: No hostname configured, using default hostname. Mar 25 01:31:50.111065 systemd[1]: Hostname set to . Mar 25 01:31:50.111078 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:31:50.111097 systemd[1]: Queued start job for default target initrd.target. Mar 25 01:31:50.111111 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:31:50.111124 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:31:50.111138 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 25 01:31:50.111151 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... 
Mar 25 01:31:50.111164 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 25 01:31:50.111178 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 25 01:31:50.111198 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 25 01:31:50.111212 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 25 01:31:50.111225 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:31:50.111239 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:31:50.111252 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:31:50.111265 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:31:50.111278 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:31:50.111291 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:31:50.111310 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:31:50.111323 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:31:50.111336 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 25 01:31:50.111350 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 25 01:31:50.111363 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:31:50.111377 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:31:50.111390 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:31:50.111403 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:31:50.111417 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 25 01:31:50.111435 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:31:50.111449 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 25 01:31:50.111462 systemd[1]: Starting systemd-fsck-usr.service... Mar 25 01:31:50.111475 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:31:50.111489 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:31:50.111502 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:31:50.111559 systemd-journald[202]: Collecting audit messages is disabled. Mar 25 01:31:50.111597 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 25 01:31:50.111623 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:31:50.111636 systemd[1]: Finished systemd-fsck-usr.service. Mar 25 01:31:50.111655 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 25 01:31:50.111668 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:31:50.111681 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:31:50.111695 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:31:50.111709 systemd-journald[202]: Journal started Mar 25 01:31:50.111743 systemd-journald[202]: Runtime Journal (/run/log/journal/f02c46255e86425f9d87c1b1fd4543ab) is 4.7M, max 37.9M, 33.2M free. 
Mar 25 01:31:50.053520 systemd-modules-load[203]: Inserted module 'overlay' Mar 25 01:31:50.117931 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 25 01:31:50.120998 systemd-modules-load[203]: Inserted module 'br_netfilter' Mar 25 01:31:50.121985 kernel: Bridge firewalling registered Mar 25 01:31:50.124957 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:31:50.128928 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:31:50.133040 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:31:50.141117 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:31:50.144043 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:31:50.146502 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:31:50.154711 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:31:50.158121 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 25 01:31:50.167073 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:31:50.168154 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:31:50.174134 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:31:50.187905 dracut-cmdline[233]: dracut-dracut-053 Mar 25 01:31:50.191954 dracut-cmdline[233]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9 Mar 25 01:31:50.229504 systemd-resolved[238]: Positive Trust Anchors: Mar 25 01:31:50.229545 systemd-resolved[238]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:31:50.229586 systemd-resolved[238]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:31:50.233698 systemd-resolved[238]: Defaulting to hostname 'linux'. Mar 25 01:31:50.235417 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:31:50.239031 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:31:50.304924 kernel: SCSI subsystem initialized Mar 25 01:31:50.316093 kernel: Loading iSCSI transport class v2.0-870. Mar 25 01:31:50.328934 kernel: iscsi: registered transport (tcp) Mar 25 01:31:50.354218 kernel: iscsi: registered transport (qla4xxx) Mar 25 01:31:50.354285 kernel: QLogic iSCSI HBA Driver Mar 25 01:31:50.405352 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. 
Mar 25 01:31:50.407787 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 25 01:31:50.447932 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 25 01:31:50.448193 kernel: device-mapper: uevent: version 1.0.3 Mar 25 01:31:50.450559 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 25 01:31:50.497958 kernel: raid6: sse2x4 gen() 14280 MB/s Mar 25 01:31:50.515991 kernel: raid6: sse2x2 gen() 9578 MB/s Mar 25 01:31:50.534483 kernel: raid6: sse2x1 gen() 9805 MB/s Mar 25 01:31:50.534576 kernel: raid6: using algorithm sse2x4 gen() 14280 MB/s Mar 25 01:31:50.553532 kernel: raid6: .... xor() 8047 MB/s, rmw enabled Mar 25 01:31:50.553621 kernel: raid6: using ssse3x2 recovery algorithm Mar 25 01:31:50.577990 kernel: xor: automatically using best checksumming function avx Mar 25 01:31:50.738991 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 25 01:31:50.753023 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:31:50.756416 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:31:50.788469 systemd-udevd[421]: Using default interface naming scheme 'v255'. Mar 25 01:31:50.797219 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:31:50.800711 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 25 01:31:50.833251 dracut-pre-trigger[427]: rd.md=0: removing MD RAID activation Mar 25 01:31:50.871310 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:31:50.873819 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:31:50.993665 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:31:50.999302 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 25 01:31:51.034652 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 25 01:31:51.037756 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:31:51.038880 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:31:51.042244 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:31:51.045295 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 25 01:31:51.073913 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:31:51.120931 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Mar 25 01:31:51.175659 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Mar 25 01:31:51.175919 kernel: cryptd: max_cpu_qlen set to 1000 Mar 25 01:31:51.175953 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 25 01:31:51.175983 kernel: GPT:17805311 != 125829119 Mar 25 01:31:51.176002 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 25 01:31:51.176029 kernel: GPT:17805311 != 125829119 Mar 25 01:31:51.176045 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 25 01:31:51.176061 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:31:51.176077 kernel: AVX version of gcm_enc/dec engaged. 
Mar 25 01:31:51.176093 kernel: AES CTR mode by8 optimization enabled Mar 25 01:31:51.187501 kernel: ACPI: bus type USB registered Mar 25 01:31:51.187548 kernel: usbcore: registered new interface driver usbfs Mar 25 01:31:51.191937 kernel: usbcore: registered new interface driver hub Mar 25 01:31:51.197914 kernel: usbcore: registered new device driver usb Mar 25 01:31:51.198544 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:31:51.198721 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:31:51.205799 kernel: libata version 3.00 loaded. Mar 25 01:31:51.203230 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:31:51.204177 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:31:51.204361 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:31:51.207033 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:31:51.216850 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:31:51.222469 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:31:51.226406 kernel: ahci 0000:00:1f.2: version 3.0 Mar 25 01:31:51.288647 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 25 01:31:51.288696 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Mar 25 01:31:51.289208 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 25 01:31:51.289444 kernel: scsi host0: ahci Mar 25 01:31:51.289693 kernel: scsi host1: ahci Mar 25 01:31:51.290115 kernel: scsi host2: ahci Mar 25 01:31:51.290381 kernel: scsi host3: ahci Mar 25 01:31:51.290620 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (471) Mar 25 01:31:51.290650 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Mar 25 01:31:51.319136 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Mar 25 01:31:51.319427 kernel: scsi host4: ahci Mar 25 01:31:51.319688 kernel: scsi host5: ahci Mar 25 01:31:51.319995 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38 Mar 25 01:31:51.320019 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38 Mar 25 01:31:51.320050 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38 Mar 25 01:31:51.320068 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38 Mar 25 01:31:51.320086 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38 Mar 25 01:31:51.320104 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38 Mar 25 01:31:51.320120 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 25 01:31:51.320342 kernel: BTRFS: device fsid 6d9424cd-1432-492b-b006-b311869817e2 devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (465) Mar 25 01:31:51.320374 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Mar 25 01:31:51.320617 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Mar 25 01:31:51.320895 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Mar 25 01:31:51.323222 kernel: hub 1-0:1.0: USB hub found Mar 25 01:31:51.323484 kernel: hub 1-0:1.0: 4 ports detected Mar 25 01:31:51.323730 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Mar 25 01:31:51.324969 kernel: hub 2-0:1.0: USB hub found Mar 25 01:31:51.325220 kernel: hub 2-0:1.0: 4 ports detected Mar 25 01:31:51.331379 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Mar 25 01:31:51.395093 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:31:51.423626 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 25 01:31:51.435811 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 25 01:31:51.446304 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 25 01:31:51.447111 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 25 01:31:51.451094 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 25 01:31:51.455063 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:31:51.467883 disk-uuid[563]: Primary Header is updated. Mar 25 01:31:51.467883 disk-uuid[563]: Secondary Entries is updated. Mar 25 01:31:51.467883 disk-uuid[563]: Secondary Header is updated. Mar 25 01:31:51.475926 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:31:51.491538 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:31:51.556977 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 25 01:31:51.599882 kernel: ata1: SATA link down (SStatus 0 SControl 300) Mar 25 01:31:51.600007 kernel: ata2: SATA link down (SStatus 0 SControl 300) Mar 25 01:31:51.605975 kernel: ata5: SATA link down (SStatus 0 SControl 300) Mar 25 01:31:51.606011 kernel: ata3: SATA link down (SStatus 0 SControl 300) Mar 25 01:31:51.606038 kernel: ata6: SATA link down (SStatus 0 SControl 300) Mar 25 01:31:51.606055 kernel: ata4: SATA link down (SStatus 0 SControl 300) Mar 25 01:31:51.703948 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 25 01:31:51.711056 kernel: usbcore: registered new interface driver usbhid Mar 25 01:31:51.711093 kernel: usbhid: USB HID core driver Mar 25 01:31:51.720909 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Mar 25 01:31:51.720947 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Mar 25 01:31:52.493266 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:31:52.494108 disk-uuid[564]: The operation has completed successfully. Mar 25 01:31:52.553316 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 25 01:31:52.553483 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 25 01:31:52.596100 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 25 01:31:52.613005 sh[585]: Success Mar 25 01:31:52.629990 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Mar 25 01:31:52.708737 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 25 01:31:52.715004 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 25 01:31:52.716722 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 25 01:31:52.743239 kernel: BTRFS info (device dm-0): first mount of filesystem 6d9424cd-1432-492b-b006-b311869817e2 Mar 25 01:31:52.743294 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 25 01:31:52.745533 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 25 01:31:52.748704 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 25 01:31:52.748742 kernel: BTRFS info (device dm-0): using free space tree Mar 25 01:31:52.758688 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 25 01:31:52.760101 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 25 01:31:52.763097 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 25 01:31:52.764654 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 25 01:31:52.795961 kernel: BTRFS info (device vda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67 Mar 25 01:31:52.799518 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 25 01:31:52.799562 kernel: BTRFS info (device vda6): using free space tree Mar 25 01:31:52.804913 kernel: BTRFS info (device vda6): auto enabling async discard Mar 25 01:31:52.811958 kernel: BTRFS info (device vda6): last unmount of filesystem a72930ba-1354-475c-94df-b83a66efea67 Mar 25 01:31:52.814536 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 25 01:31:52.818104 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 25 01:31:52.916223 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:31:52.921135 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:31:52.967316 ignition[689]: Ignition 2.20.0 Mar 25 01:31:52.967334 ignition[689]: Stage: fetch-offline Mar 25 01:31:52.967421 ignition[689]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:31:52.967454 ignition[689]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 25 01:31:52.967641 ignition[689]: parsed url from cmdline: "" Mar 25 01:31:52.967648 ignition[689]: no config URL provided Mar 25 01:31:52.967657 ignition[689]: reading system config file "/usr/lib/ignition/user.ign" Mar 25 01:31:52.972093 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:31:52.967673 ignition[689]: no config at "/usr/lib/ignition/user.ign" Mar 25 01:31:52.974429 systemd-networkd[766]: lo: Link UP Mar 25 01:31:52.967682 ignition[689]: failed to fetch config: resource requires networking Mar 25 01:31:52.974435 systemd-networkd[766]: lo: Gained carrier Mar 25 01:31:52.967976 ignition[689]: Ignition finished successfully Mar 25 01:31:52.976769 systemd-networkd[766]: Enumeration completed Mar 25 01:31:52.977024 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:31:52.977339 systemd-networkd[766]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:31:52.977346 systemd-networkd[766]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:31:52.979199 systemd[1]: Reached target network.target - Network. 
Mar 25 01:31:52.980485 systemd-networkd[766]: eth0: Link UP Mar 25 01:31:52.980493 systemd-networkd[766]: eth0: Gained carrier Mar 25 01:31:52.980511 systemd-networkd[766]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:31:52.983177 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Mar 25 01:31:52.998074 systemd-networkd[766]: eth0: DHCPv4 address 10.243.75.178/30, gateway 10.243.75.177 acquired from 10.243.75.177 Mar 25 01:31:53.014655 ignition[775]: Ignition 2.20.0 Mar 25 01:31:53.014677 ignition[775]: Stage: fetch Mar 25 01:31:53.014934 ignition[775]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:31:53.014957 ignition[775]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 25 01:31:53.015084 ignition[775]: parsed url from cmdline: "" Mar 25 01:31:53.015092 ignition[775]: no config URL provided Mar 25 01:31:53.015101 ignition[775]: reading system config file "/usr/lib/ignition/user.ign" Mar 25 01:31:53.015117 ignition[775]: no config at "/usr/lib/ignition/user.ign" Mar 25 01:31:53.015325 ignition[775]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Mar 25 01:31:53.016656 ignition[775]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Mar 25 01:31:53.016685 ignition[775]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Mar 25 01:31:53.034110 ignition[775]: GET result: OK Mar 25 01:31:53.035118 ignition[775]: parsing config with SHA512: 27810dcb1469006679d79cb545d87ce4b100ede79a5286f1ac996db0e8330d5a3a21f141d8108e65ec28d96c2cce02d25ffc55fe0f8f5556ed374124b4a4c824 Mar 25 01:31:53.042878 unknown[775]: fetched base config from "system" Mar 25 01:31:53.043299 ignition[775]: fetch: fetch complete Mar 25 01:31:53.042915 unknown[775]: fetched base config from "system" Mar 25 01:31:53.043307 ignition[775]: fetch: fetch passed Mar 25 01:31:53.042936 unknown[775]: fetched user config from "openstack" Mar 25 01:31:53.043369 ignition[775]: Ignition finished successfully Mar 25 01:31:53.046278 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 25 01:31:53.050295 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 25 01:31:53.076954 ignition[782]: Ignition 2.20.0 Mar 25 01:31:53.076974 ignition[782]: Stage: kargs Mar 25 01:31:53.077183 ignition[782]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:31:53.079461 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 25 01:31:53.077203 ignition[782]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 25 01:31:53.078214 ignition[782]: kargs: kargs passed Mar 25 01:31:53.078284 ignition[782]: Ignition finished successfully Mar 25 01:31:53.083098 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 25 01:31:53.109217 ignition[789]: Ignition 2.20.0 Mar 25 01:31:53.109235 ignition[789]: Stage: disks Mar 25 01:31:53.109442 ignition[789]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:31:53.109460 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 25 01:31:53.113745 ignition[789]: disks: disks passed Mar 25 01:31:53.114426 ignition[789]: Ignition finished successfully Mar 25 01:31:53.115593 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 25 01:31:53.117108 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 25 01:31:53.117841 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
Mar 25 01:31:53.119449 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:31:53.121005 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:31:53.122239 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:31:53.124820 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 25 01:31:53.155581 systemd-fsck[797]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Mar 25 01:31:53.158525 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 25 01:31:53.162016 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 25 01:31:53.273921 kernel: EXT4-fs (vda9): mounted filesystem 4e6dca82-2e50-453c-be25-61f944b72008 r/w with ordered data mode. Quota mode: none. Mar 25 01:31:53.275114 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 25 01:31:53.276323 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 25 01:31:53.288287 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:31:53.291995 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 25 01:31:53.294417 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 25 01:31:53.304069 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Mar 25 01:31:53.305801 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 25 01:31:53.309425 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:31:53.314971 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (805) Mar 25 01:31:53.315022 kernel: BTRFS info (device vda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67 Mar 25 01:31:53.315253 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 25 01:31:53.320876 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 25 01:31:53.320939 kernel: BTRFS info (device vda6): using free space tree Mar 25 01:31:53.324936 kernel: BTRFS info (device vda6): auto enabling async discard Mar 25 01:31:53.329105 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 25 01:31:53.331514 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 25 01:31:53.410250 initrd-setup-root[834]: cut: /sysroot/etc/passwd: No such file or directory Mar 25 01:31:53.421029 initrd-setup-root[841]: cut: /sysroot/etc/group: No such file or directory Mar 25 01:31:53.427051 initrd-setup-root[848]: cut: /sysroot/etc/shadow: No such file or directory Mar 25 01:31:53.432638 initrd-setup-root[855]: cut: /sysroot/etc/gshadow: No such file or directory Mar 25 01:31:53.546597 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 25 01:31:53.549980 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 25 01:31:53.554075 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
Mar 25 01:31:53.571032 kernel: BTRFS info (device vda6): last unmount of filesystem a72930ba-1354-475c-94df-b83a66efea67 Mar 25 01:31:53.598720 ignition[924]: INFO : Ignition 2.20.0 Mar 25 01:31:53.599797 ignition[924]: INFO : Stage: mount Mar 25 01:31:53.601795 ignition[924]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:31:53.601795 ignition[924]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 25 01:31:53.601795 ignition[924]: INFO : mount: mount passed Mar 25 01:31:53.601795 ignition[924]: INFO : Ignition finished successfully Mar 25 01:31:53.604853 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 25 01:31:53.606194 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 25 01:31:53.741994 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 25 01:31:54.788535 systemd-networkd[766]: eth0: Gained IPv6LL Mar 25 01:31:56.296361 systemd-networkd[766]: eth0: Ignoring DHCPv6 address 2a02:1348:17c:d2ec:24:19ff:fef3:4bb2/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17c:d2ec:24:19ff:fef3:4bb2/64 assigned by NDisc. Mar 25 01:31:56.296381 systemd-networkd[766]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Mar 25 01:32:00.459134 coreos-metadata[807]: Mar 25 01:32:00.459 WARN failed to locate config-drive, using the metadata service API instead Mar 25 01:32:00.483028 coreos-metadata[807]: Mar 25 01:32:00.482 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 25 01:32:00.500071 coreos-metadata[807]: Mar 25 01:32:00.499 INFO Fetch successful Mar 25 01:32:00.500972 coreos-metadata[807]: Mar 25 01:32:00.500 INFO wrote hostname srv-y0b1r.gb1.brightbox.com to /sysroot/etc/hostname Mar 25 01:32:00.502855 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Mar 25 01:32:00.503070 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Mar 25 01:32:00.507005 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 25 01:32:00.527287 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:32:00.554995 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (940) Mar 25 01:32:00.556959 kernel: BTRFS info (device vda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67 Mar 25 01:32:00.558378 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 25 01:32:00.560165 kernel: BTRFS info (device vda6): using free space tree Mar 25 01:32:00.565932 kernel: BTRFS info (device vda6): auto enabling async discard Mar 25 01:32:00.569028 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 25 01:32:00.597094 ignition[957]: INFO : Ignition 2.20.0 Mar 25 01:32:00.597094 ignition[957]: INFO : Stage: files Mar 25 01:32:00.598921 ignition[957]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:32:00.598921 ignition[957]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 25 01:32:00.598921 ignition[957]: DEBUG : files: compiled without relabeling support, skipping Mar 25 01:32:00.601821 ignition[957]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 25 01:32:00.601821 ignition[957]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 25 01:32:00.603860 ignition[957]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 25 01:32:00.604818 ignition[957]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 25 01:32:00.604818 ignition[957]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 25 01:32:00.604616 unknown[957]: wrote ssh authorized keys file for user: core Mar 25 01:32:00.608127 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 25 01:32:00.608127 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Mar 25 01:32:02.445124 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 25 01:32:08.486545 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 25 01:32:08.488321 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 25 01:32:08.489359 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 25 01:32:08.489359 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:32:08.489359 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:32:08.489359 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:32:08.489359 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:32:08.489359 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:32:08.489359 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:32:08.496868 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:32:08.496868 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:32:08.496868 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 25 01:32:08.496868 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 25 01:32:08.496868 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 25 01:32:08.496868 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Mar 25 01:32:09.154703 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 25 01:32:12.158140 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 25 01:32:12.158140 ignition[957]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 25 01:32:12.162377 ignition[957]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:32:12.162377 ignition[957]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:32:12.162377 ignition[957]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 25 01:32:12.162377 ignition[957]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 25 01:32:12.162377 ignition[957]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 25 01:32:12.162377 ignition[957]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:32:12.162377 ignition[957]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:32:12.162377 ignition[957]: INFO : files: files passed Mar 25 01:32:12.162377 ignition[957]: INFO : Ignition finished successfully Mar 25 01:32:12.163226 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 25 01:32:12.169142 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 25 01:32:12.174206 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 25 01:32:12.195611 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 25 01:32:12.195790 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 25 01:32:12.204913 initrd-setup-root-after-ignition[988]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:32:12.206377 initrd-setup-root-after-ignition[988]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:32:12.207538 initrd-setup-root-after-ignition[992]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:32:12.208447 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:32:12.210111 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 25 01:32:12.212072 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 25 01:32:12.271689 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 25 01:32:12.271873 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 25 01:32:12.273548 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Mar 25 01:32:12.274735 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 25 01:32:12.276262 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 25 01:32:12.277952 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 25 01:32:12.304862 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:32:12.307310 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 25 01:32:12.326964 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:32:12.328767 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:32:12.329651 systemd[1]: Stopped target timers.target - Timer Units. Mar 25 01:32:12.331167 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 25 01:32:12.331361 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:32:12.334605 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 25 01:32:12.335510 systemd[1]: Stopped target basic.target - Basic System. Mar 25 01:32:12.336813 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 25 01:32:12.338071 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:32:12.339539 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 25 01:32:12.340967 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 25 01:32:12.343222 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:32:12.344238 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 25 01:32:12.345794 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 25 01:32:12.347158 systemd[1]: Stopped target swap.target - Swaps. Mar 25 01:32:12.348334 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 25 01:32:12.348620 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:32:12.350038 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:32:12.351052 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:32:12.352445 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 25 01:32:12.352663 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:32:12.354048 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 25 01:32:12.354286 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 25 01:32:12.355966 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 25 01:32:12.356129 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:32:12.357884 systemd[1]: ignition-files.service: Deactivated successfully. Mar 25 01:32:12.358062 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 25 01:32:12.362141 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 25 01:32:12.363446 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 25 01:32:12.363689 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:32:12.367304 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 25 01:32:12.370419 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Mar 25 01:32:12.370609 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:32:12.377578 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 25 01:32:12.377731 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:32:12.388673 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 25 01:32:12.390126 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 25 01:32:12.394912 ignition[1012]: INFO : Ignition 2.20.0 Mar 25 01:32:12.394912 ignition[1012]: INFO : Stage: umount Mar 25 01:32:12.394912 ignition[1012]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:32:12.394912 ignition[1012]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 25 01:32:12.399097 ignition[1012]: INFO : umount: umount passed Mar 25 01:32:12.399097 ignition[1012]: INFO : Ignition finished successfully Mar 25 01:32:12.396798 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 25 01:32:12.396941 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 25 01:32:12.398609 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 25 01:32:12.398750 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 25 01:32:12.399734 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 25 01:32:12.399813 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 25 01:32:12.407634 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 25 01:32:12.407718 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 25 01:32:12.408856 systemd[1]: Stopped target network.target - Network. Mar 25 01:32:12.410055 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 25 01:32:12.410140 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:32:12.411433 systemd[1]: Stopped target paths.target - Path Units. Mar 25 01:32:12.414256 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 25 01:32:12.416270 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:32:12.417216 systemd[1]: Stopped target slices.target - Slice Units. Mar 25 01:32:12.420725 systemd[1]: Stopped target sockets.target - Socket Units. Mar 25 01:32:12.422068 systemd[1]: iscsid.socket: Deactivated successfully. Mar 25 01:32:12.422145 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:32:12.423244 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 25 01:32:12.423305 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:32:12.424498 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 25 01:32:12.424569 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 25 01:32:12.425743 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 25 01:32:12.425806 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 25 01:32:12.427183 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 25 01:32:12.429179 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 25 01:32:12.432105 systemd-networkd[766]: eth0: DHCPv6 lease lost Mar 25 01:32:12.433437 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 25 01:32:12.434339 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 25 01:32:12.434515 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. 
Mar 25 01:32:12.437333 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 25 01:32:12.437973 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 25 01:32:12.444033 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 25 01:32:12.444467 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 25 01:32:12.444628 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 25 01:32:12.447036 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 25 01:32:12.449050 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 25 01:32:12.449174 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:32:12.450665 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 25 01:32:12.450740 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 25 01:32:12.454049 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 25 01:32:12.454705 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 25 01:32:12.454779 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:32:12.456315 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 25 01:32:12.456383 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:32:12.457169 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 25 01:32:12.457244 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 25 01:32:12.457923 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 25 01:32:12.457988 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:32:12.460056 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:32:12.463572 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 25 01:32:12.463667 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:32:12.471231 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 25 01:32:12.472961 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:32:12.476096 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 25 01:32:12.476177 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 25 01:32:12.477762 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 25 01:32:12.477823 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:32:12.479132 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 25 01:32:12.479203 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:32:12.482638 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 25 01:32:12.482709 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 25 01:32:12.485071 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:32:12.485150 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:32:12.488069 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 25 01:32:12.488972 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. 
Mar 25 01:32:12.489046 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:32:12.491398 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:32:12.491491 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:32:12.493620 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 25 01:32:12.493712 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:32:12.494323 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 25 01:32:12.494494 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 25 01:32:12.513103 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 25 01:32:12.513265 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 25 01:32:12.514989 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 25 01:32:12.518085 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 25 01:32:12.537693 systemd[1]: Switching root. Mar 25 01:32:12.567767 systemd-journald[202]: Journal stopped Mar 25 01:32:14.292464 systemd-journald[202]: Received SIGTERM from PID 1 (systemd). Mar 25 01:32:14.292568 kernel: SELinux: policy capability network_peer_controls=1 Mar 25 01:32:14.292612 kernel: SELinux: policy capability open_perms=1 Mar 25 01:32:14.292641 kernel: SELinux: policy capability extended_socket_class=1 Mar 25 01:32:14.292667 kernel: SELinux: policy capability always_check_network=0 Mar 25 01:32:14.292694 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 25 01:32:14.292723 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 25 01:32:14.292752 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 25 01:32:14.292788 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 25 01:32:14.292816 kernel: audit: type=1403 audit(1742866332.949:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 25 01:32:14.292860 systemd[1]: Successfully loaded SELinux policy in 56.772ms. Mar 25 01:32:14.292927 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 23.582ms. Mar 25 01:32:14.292960 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:32:14.292988 systemd[1]: Detected virtualization kvm. Mar 25 01:32:14.293010 systemd[1]: Detected architecture x86-64. Mar 25 01:32:14.293037 systemd[1]: Detected first boot. Mar 25 01:32:14.293059 systemd[1]: Hostname set to <srv-y0b1r.gb1.brightbox.com>. Mar 25 01:32:14.293079 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:32:14.293114 zram_generator::config[1057]: No configuration found. Mar 25 01:32:14.293147 kernel: Guest personality initialized and is inactive Mar 25 01:32:14.293173 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Mar 25 01:32:14.293193 kernel: Initialized host personality Mar 25 01:32:14.293218 kernel: NET: Registered PF_VSOCK protocol family Mar 25 01:32:14.293244 systemd[1]: Populated /etc with preset unit settings. Mar 25 01:32:14.293268 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 25 01:32:14.293289 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 25 01:32:14.293315 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 25 01:32:14.293348 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 25 01:32:14.293381 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 25 01:32:14.293403 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 25 01:32:14.293435 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 25 01:32:14.293463 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 25 01:32:14.293485 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 25 01:32:14.293508 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 25 01:32:14.293534 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 25 01:32:14.293570 systemd[1]: Created slice user.slice - User and Session Slice. Mar 25 01:32:14.293593 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:32:14.293614 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:32:14.293635 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 25 01:32:14.293655 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 25 01:32:14.293677 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 25 01:32:14.293717 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:32:14.293772 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 25 01:32:14.293805 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:32:14.293826 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 25 01:32:14.293847 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 25 01:32:14.293871 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 25 01:32:14.293935 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 25 01:32:14.293966 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:32:14.293996 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:32:14.294022 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:32:14.294044 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:32:14.294074 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 25 01:32:14.294094 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 25 01:32:14.294115 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 25 01:32:14.294135 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:32:14.294194 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:32:14.294218 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:32:14.294239 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. 
Mar 25 01:32:14.294267 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 25 01:32:14.294300 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 25 01:32:14.294319 systemd[1]: Mounting media.mount - External Media Directory... Mar 25 01:32:14.294346 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 25 01:32:14.294366 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 25 01:32:14.294400 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 25 01:32:14.294451 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 25 01:32:14.294475 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 25 01:32:14.294503 systemd[1]: Reached target machines.target - Containers. Mar 25 01:32:14.294524 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 25 01:32:14.294545 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:32:14.294572 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:32:14.294594 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 25 01:32:14.294615 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:32:14.294649 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:32:14.294677 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:32:14.294699 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 25 01:32:14.294720 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:32:14.294741 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 25 01:32:14.294769 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 25 01:32:14.294798 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 25 01:32:14.294819 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 25 01:32:14.294840 systemd[1]: Stopped systemd-fsck-usr.service. Mar 25 01:32:14.294874 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:32:14.296937 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:32:14.296966 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:32:14.296989 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 25 01:32:14.297010 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 25 01:32:14.297047 kernel: fuse: init (API version 7.39) Mar 25 01:32:14.297070 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 25 01:32:14.297091 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:32:14.297112 systemd[1]: verity-setup.service: Deactivated successfully. 
Mar 25 01:32:14.297132 systemd[1]: Stopped verity-setup.service. Mar 25 01:32:14.297153 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 25 01:32:14.297174 kernel: loop: module loaded Mar 25 01:32:14.297194 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 25 01:32:14.297235 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 25 01:32:14.297277 systemd[1]: Mounted media.mount - External Media Directory. Mar 25 01:32:14.297300 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 25 01:32:14.297325 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 25 01:32:14.297358 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 25 01:32:14.297388 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 25 01:32:14.297409 kernel: ACPI: bus type drm_connector registered Mar 25 01:32:14.297443 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:32:14.297464 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 25 01:32:14.297485 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 25 01:32:14.297505 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:32:14.297534 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:32:14.297555 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:32:14.297589 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:32:14.297611 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:32:14.297632 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:32:14.297653 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 25 01:32:14.297674 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 25 01:32:14.297696 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:32:14.297716 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:32:14.297737 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:32:14.297764 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 25 01:32:14.297834 systemd-journald[1154]: Collecting audit messages is disabled. Mar 25 01:32:14.297881 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 25 01:32:14.297943 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 25 01:32:14.297967 systemd-journald[1154]: Journal started Mar 25 01:32:14.297998 systemd-journald[1154]: Runtime Journal (/run/log/journal/f02c46255e86425f9d87c1b1fd4543ab) is 4.7M, max 37.9M, 33.2M free. Mar 25 01:32:13.859972 systemd[1]: Queued start job for default target multi-user.target. Mar 25 01:32:13.875202 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Mar 25 01:32:13.875965 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 25 01:32:14.304001 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:32:14.320137 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 25 01:32:14.329089 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... 
Mar 25 01:32:14.335981 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 25 01:32:14.338028 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 25 01:32:14.338092 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:32:14.341404 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 25 01:32:14.350095 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 25 01:32:14.353204 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 25 01:32:14.354120 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:32:14.358141 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 25 01:32:14.362168 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 25 01:32:14.364013 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:32:14.367011 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 25 01:32:14.367796 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:32:14.376078 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:32:14.381221 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 25 01:32:14.391162 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 25 01:32:14.397001 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 25 01:32:14.397956 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 25 01:32:14.399480 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 25 01:32:14.427147 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 25 01:32:14.428709 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 25 01:32:14.433652 systemd-journald[1154]: Time spent on flushing to /var/log/journal/f02c46255e86425f9d87c1b1fd4543ab is 93.801ms for 1160 entries. Mar 25 01:32:14.433652 systemd-journald[1154]: System Journal (/var/log/journal/f02c46255e86425f9d87c1b1fd4543ab) is 8M, max 584.8M, 576.8M free. Mar 25 01:32:14.574071 systemd-journald[1154]: Received client request to flush runtime journal. Mar 25 01:32:14.574879 kernel: loop0: detected capacity change from 0 to 8 Mar 25 01:32:14.574973 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 25 01:32:14.575007 kernel: loop1: detected capacity change from 0 to 151640 Mar 25 01:32:14.440110 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 25 01:32:14.505791 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:32:14.536595 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 25 01:32:14.569752 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 25 01:32:14.575298 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Mar 25 01:32:14.577466 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 25 01:32:14.606942 kernel: loop2: detected capacity change from 0 to 210664 Mar 25 01:32:14.659775 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:32:14.663947 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 25 01:32:14.679648 systemd-tmpfiles[1211]: ACLs are not supported, ignoring. Mar 25 01:32:14.679675 systemd-tmpfiles[1211]: ACLs are not supported, ignoring. Mar 25 01:32:14.683933 kernel: loop3: detected capacity change from 0 to 109808 Mar 25 01:32:14.702896 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:32:14.726435 udevadm[1218]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Mar 25 01:32:14.734945 kernel: loop4: detected capacity change from 0 to 8 Mar 25 01:32:14.744928 kernel: loop5: detected capacity change from 0 to 151640 Mar 25 01:32:14.777079 kernel: loop6: detected capacity change from 0 to 210664 Mar 25 01:32:14.814775 kernel: loop7: detected capacity change from 0 to 109808 Mar 25 01:32:14.829131 (sd-merge)[1221]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Mar 25 01:32:14.830074 (sd-merge)[1221]: Merged extensions into '/usr'. Mar 25 01:32:14.839114 systemd[1]: Reload requested from client PID 1196 ('systemd-sysext') (unit systemd-sysext.service)... Mar 25 01:32:14.839136 systemd[1]: Reloading... Mar 25 01:32:14.996193 zram_generator::config[1247]: No configuration found. Mar 25 01:32:15.131123 ldconfig[1191]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 25 01:32:15.290202 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:32:15.385201 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 25 01:32:15.386281 systemd[1]: Reloading finished in 546 ms. Mar 25 01:32:15.414778 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 25 01:32:15.416141 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 25 01:32:15.434105 systemd[1]: Starting ensure-sysext.service... Mar 25 01:32:15.439960 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:32:15.468705 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 25 01:32:15.477378 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:32:15.478358 systemd[1]: Reload requested from client PID 1305 ('systemctl') (unit ensure-sysext.service)... Mar 25 01:32:15.478410 systemd[1]: Reloading... Mar 25 01:32:15.512909 systemd-tmpfiles[1306]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 25 01:32:15.516237 systemd-tmpfiles[1306]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 25 01:32:15.517721 systemd-tmpfiles[1306]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 25 01:32:15.518147 systemd-tmpfiles[1306]: ACLs are not supported, ignoring. 
Mar 25 01:32:15.518267 systemd-tmpfiles[1306]: ACLs are not supported, ignoring. Mar 25 01:32:15.531114 systemd-tmpfiles[1306]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:32:15.531132 systemd-tmpfiles[1306]: Skipping /boot Mar 25 01:32:15.532989 systemd-udevd[1309]: Using default interface naming scheme 'v255'. Mar 25 01:32:15.575029 systemd-tmpfiles[1306]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:32:15.575049 systemd-tmpfiles[1306]: Skipping /boot Mar 25 01:32:15.622556 zram_generator::config[1345]: No configuration found. Mar 25 01:32:15.774950 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1371) Mar 25 01:32:15.873047 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:32:15.948206 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Mar 25 01:32:15.981307 kernel: ACPI: button: Power Button [PWRF] Mar 25 01:32:16.009510 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 25 01:32:16.009865 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 25 01:32:16.010936 kernel: mousedev: PS/2 mouse device common for all mice Mar 25 01:32:16.011500 systemd[1]: Reloading finished in 532 ms. Mar 25 01:32:16.026820 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:32:16.044630 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:32:16.075942 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Mar 25 01:32:16.076778 systemd[1]: Finished ensure-sysext.service. Mar 25 01:32:16.105931 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Mar 25 01:32:16.115206 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Mar 25 01:32:16.115509 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Mar 25 01:32:16.128579 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 25 01:32:16.130623 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:32:16.135780 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 25 01:32:16.137220 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:32:16.141007 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:32:16.150134 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:32:16.159636 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:32:16.168445 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:32:16.169833 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:32:16.187657 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
Mar 25 01:32:16.188588 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:32:16.204354 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 25 01:32:16.209182 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:32:16.215268 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:32:16.234087 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 25 01:32:16.240162 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 25 01:32:16.240875 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 25 01:32:16.243552 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:32:16.243873 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:32:16.245281 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:32:16.245880 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:32:16.247123 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:32:16.247737 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:32:16.249085 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:32:16.249540 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:32:16.257848 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:32:16.258160 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:32:16.260962 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:32:16.275220 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 25 01:32:16.280303 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 25 01:32:16.319229 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 25 01:32:16.326063 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 25 01:32:16.332676 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 25 01:32:16.360979 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 25 01:32:16.382174 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 25 01:32:16.386909 augenrules[1463]: No rules Mar 25 01:32:16.388413 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:32:16.388714 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:32:16.429401 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 25 01:32:16.468337 systemd[1]: Finished systemd-update-done.service - Update is Completed. 
Mar 25 01:32:16.628017 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 25 01:32:16.631484 systemd-networkd[1432]: lo: Link UP Mar 25 01:32:16.631497 systemd-networkd[1432]: lo: Gained carrier Mar 25 01:32:16.637099 systemd-networkd[1432]: Enumeration completed Mar 25 01:32:16.637696 systemd-networkd[1432]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:32:16.637708 systemd-networkd[1432]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:32:16.639023 systemd-timesyncd[1436]: No network connectivity, watching for changes. Mar 25 01:32:16.641007 systemd-networkd[1432]: eth0: Link UP Mar 25 01:32:16.641019 systemd-networkd[1432]: eth0: Gained carrier Mar 25 01:32:16.641040 systemd-networkd[1432]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:32:16.648803 systemd-resolved[1433]: Positive Trust Anchors: Mar 25 01:32:16.649254 systemd-resolved[1433]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:32:16.649330 systemd-resolved[1433]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:32:16.655670 systemd-resolved[1433]: Using system hostname 'srv-y0b1r.gb1.brightbox.com'. Mar 25 01:32:16.663293 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 25 01:32:16.664291 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:32:16.665266 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:32:16.666516 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:32:16.668299 systemd[1]: Reached target network.target - Network. Mar 25 01:32:16.669017 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:32:16.670119 systemd[1]: Reached target time-set.target - System Time Set. Mar 25 01:32:16.672609 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 25 01:32:16.673120 systemd-networkd[1432]: eth0: DHCPv4 address 10.243.75.178/30, gateway 10.243.75.177 acquired from 10.243.75.177 Mar 25 01:32:16.675448 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 25 01:32:16.677051 systemd-timesyncd[1436]: Network configuration changed, trying to establish connection. Mar 25 01:32:16.679297 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 25 01:32:16.705437 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 25 01:32:16.709403 lvm[1485]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:32:16.744420 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. 
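Note: systemd-networkd reports lo and eth0 gaining carrier and eth0 acquiring 10.243.75.178/30 via DHCPv4 from 10.243.75.177. A minimal sketch of reading the same link state straight from sysfs (illustrative only, standard kernel paths assumed):

    package main

    import (
        "fmt"
        "os"
        "strings"
    )

    func main() {
        iface := "eth0"
        // operstate, carrier and address are plain text attributes exposed
        // by the kernel for every network interface.
        for _, attr := range []string{"operstate", "carrier", "address"} {
            b, err := os.ReadFile("/sys/class/net/" + iface + "/" + attr)
            if err != nil {
                fmt.Printf("%s/%s: %v\n", iface, attr, err)
                continue
            }
            fmt.Printf("%s/%s: %s\n", iface, attr, strings.TrimSpace(string(b)))
        }
    }
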
Mar 25 01:32:16.746126 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:32:16.746949 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:32:16.747788 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 25 01:32:16.748647 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 25 01:32:16.749668 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 25 01:32:16.750707 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 25 01:32:16.751511 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 25 01:32:16.752287 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 25 01:32:16.752329 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:32:16.752957 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:32:16.754684 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 25 01:32:16.757252 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 25 01:32:16.762056 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 25 01:32:16.763058 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 25 01:32:16.763816 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 25 01:32:16.773658 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 25 01:32:16.775348 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 25 01:32:16.777767 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 25 01:32:16.779319 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 25 01:32:16.780144 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:32:16.780758 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:32:16.781449 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:32:16.781501 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:32:16.785031 systemd[1]: Starting containerd.service - containerd container runtime... Mar 25 01:32:16.789183 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 25 01:32:16.792202 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 25 01:32:16.797060 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 25 01:32:16.799604 lvm[1492]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:32:16.802589 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 25 01:32:16.804029 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 25 01:32:16.808336 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 25 01:32:16.820776 jq[1496]: false Mar 25 01:32:16.816170 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... 
Mar 25 01:32:16.824171 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 25 01:32:16.830634 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 25 01:32:16.841446 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 25 01:32:16.843248 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 25 01:32:16.851980 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 25 01:32:16.857142 systemd[1]: Starting update-engine.service - Update Engine... Mar 25 01:32:16.861061 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 25 01:32:16.875559 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 25 01:32:16.876304 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 25 01:32:16.881270 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 25 01:32:16.881764 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 25 01:32:16.891698 extend-filesystems[1497]: Found loop4 Mar 25 01:32:16.897631 extend-filesystems[1497]: Found loop5 Mar 25 01:32:16.897631 extend-filesystems[1497]: Found loop6 Mar 25 01:32:16.897631 extend-filesystems[1497]: Found loop7 Mar 25 01:32:16.897631 extend-filesystems[1497]: Found vda Mar 25 01:32:16.897631 extend-filesystems[1497]: Found vda1 Mar 25 01:32:16.897631 extend-filesystems[1497]: Found vda2 Mar 25 01:32:16.897631 extend-filesystems[1497]: Found vda3 Mar 25 01:32:16.897631 extend-filesystems[1497]: Found usr Mar 25 01:32:16.897631 extend-filesystems[1497]: Found vda4 Mar 25 01:32:16.897631 extend-filesystems[1497]: Found vda6 Mar 25 01:32:16.897631 extend-filesystems[1497]: Found vda7 Mar 25 01:32:16.897631 extend-filesystems[1497]: Found vda9 Mar 25 01:32:16.897631 extend-filesystems[1497]: Checking size of /dev/vda9 Mar 25 01:32:16.894649 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 25 01:32:17.941669 systemd-timesyncd[1436]: Contacted time server 85.199.214.100:123 (3.flatcar.pool.ntp.org). Mar 25 01:32:17.941747 systemd-timesyncd[1436]: Initial clock synchronization to Tue 2025-03-25 01:32:17.941511 UTC. Mar 25 01:32:17.943334 systemd-resolved[1433]: Clock change detected. Flushing caches. Mar 25 01:32:17.953335 jq[1507]: true Mar 25 01:32:17.954879 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
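Note: extend-filesystems walks the block devices ("Found loop4" … "Found vda9") before checking the size of /dev/vda9. A comparable device listing can be taken from /proc/partitions; this is a hedged sketch, not the script whose output appears above:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "strings"
    )

    func main() {
        f, err := os.Open("/proc/partitions")
        if err != nil {
            fmt.Println("open /proc/partitions:", err)
            return
        }
        defer f.Close()

        sc := bufio.NewScanner(f)
        for sc.Scan() {
            // Data rows look like: major minor #blocks name
            fields := strings.Fields(sc.Text())
            if len(fields) != 4 || fields[0] == "major" {
                continue
            }
            fmt.Println("Found", fields[3])
        }
    }
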
Mar 25 01:32:17.954009 dbus-daemon[1495]: [system] SELinux support is enabled Mar 25 01:32:17.963600 dbus-daemon[1495]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1432 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Mar 25 01:32:17.973850 update_engine[1505]: I20250325 01:32:17.973362 1505 main.cc:92] Flatcar Update Engine starting Mar 25 01:32:17.985955 update_engine[1505]: I20250325 01:32:17.976938 1505 update_check_scheduler.cc:74] Next update check in 10m47s Mar 25 01:32:17.980354 dbus-daemon[1495]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 25 01:32:17.986205 tar[1511]: linux-amd64/helm Mar 25 01:32:17.978860 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 25 01:32:17.978919 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 25 01:32:17.979745 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 25 01:32:17.979774 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 25 01:32:17.980940 systemd[1]: Started update-engine.service - Update Engine. Mar 25 01:32:17.993007 (ntainerd)[1528]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 25 01:32:17.994064 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Mar 25 01:32:18.003454 extend-filesystems[1497]: Resized partition /dev/vda9 Mar 25 01:32:18.009452 extend-filesystems[1535]: resize2fs 1.47.2 (1-Jan-2025) Mar 25 01:32:18.025468 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Mar 25 01:32:18.035742 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 25 01:32:18.039342 systemd[1]: motdgen.service: Deactivated successfully. Mar 25 01:32:18.039680 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 25 01:32:18.060094 jq[1527]: true Mar 25 01:32:18.088724 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 25 01:32:18.132481 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1370) Mar 25 01:32:18.344963 systemd-logind[1504]: Watching system buttons on /dev/input/event2 (Power Button) Mar 25 01:32:18.347473 systemd-logind[1504]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 25 01:32:18.349668 systemd-logind[1504]: New seat seat0. Mar 25 01:32:18.352451 systemd[1]: Started systemd-logind.service - User Login Management. Mar 25 01:32:18.410761 bash[1558]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:32:18.412193 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 25 01:32:18.423943 systemd[1]: Starting sshkeys.service... 
Mar 25 01:32:18.434485 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Mar 25 01:32:18.457791 extend-filesystems[1535]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 25 01:32:18.457791 extend-filesystems[1535]: old_desc_blocks = 1, new_desc_blocks = 8 Mar 25 01:32:18.457791 extend-filesystems[1535]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Mar 25 01:32:18.486685 extend-filesystems[1497]: Resized filesystem in /dev/vda9 Mar 25 01:32:18.467193 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 25 01:32:18.468202 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 25 01:32:18.509404 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 25 01:32:18.515952 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 25 01:32:18.535163 containerd[1528]: time="2025-03-25T01:32:18Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 25 01:32:18.546572 containerd[1528]: time="2025-03-25T01:32:18.543988820Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 Mar 25 01:32:18.555611 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 25 01:32:18.561804 dbus-daemon[1495]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 25 01:32:18.562762 dbus-daemon[1495]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.7' (uid=0 pid=1531 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 25 01:32:18.569935 systemd[1]: Starting polkit.service - Authorization Manager... 
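Note: the entries above record the root filesystem being grown online: resize2fs 1.47.2 takes the mounted ext4 filesystem on /dev/vda9 from 1617920 to 15121403 4k blocks. A sketch of the same online grow via the resize2fs binary (device name copied from the log; only meaningful against a filesystem whose partition actually has room to grow):

    package main

    import (
        "fmt"
        "os/exec"
    )

    func main() {
        // With only the device argument, resize2fs grows the (possibly
        // mounted) ext4 filesystem to fill the underlying partition.
        out, err := exec.Command("resize2fs", "/dev/vda9").CombinedOutput()
        if err != nil {
            fmt.Printf("resize2fs failed: %v\n%s", err, out)
            return
        }
        fmt.Print(string(out))
    }
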
Mar 25 01:32:18.603418 containerd[1528]: time="2025-03-25T01:32:18.599755235Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="14.553µs" Mar 25 01:32:18.603418 containerd[1528]: time="2025-03-25T01:32:18.601522903Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 25 01:32:18.603418 containerd[1528]: time="2025-03-25T01:32:18.601579081Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 25 01:32:18.603418 containerd[1528]: time="2025-03-25T01:32:18.601941614Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 25 01:32:18.603418 containerd[1528]: time="2025-03-25T01:32:18.601989383Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 25 01:32:18.603418 containerd[1528]: time="2025-03-25T01:32:18.603109143Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:32:18.603418 containerd[1528]: time="2025-03-25T01:32:18.603286699Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:32:18.603418 containerd[1528]: time="2025-03-25T01:32:18.603321670Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:32:18.606999 locksmithd[1534]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 25 01:32:18.612146 containerd[1528]: time="2025-03-25T01:32:18.607583216Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:32:18.612146 containerd[1528]: time="2025-03-25T01:32:18.607620060Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:32:18.612146 containerd[1528]: time="2025-03-25T01:32:18.607654646Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:32:18.612146 containerd[1528]: time="2025-03-25T01:32:18.607670061Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 25 01:32:18.612146 containerd[1528]: time="2025-03-25T01:32:18.607862986Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 25 01:32:18.612146 containerd[1528]: time="2025-03-25T01:32:18.608295896Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:32:18.612146 containerd[1528]: time="2025-03-25T01:32:18.608358721Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:32:18.612146 containerd[1528]: time="2025-03-25T01:32:18.608378157Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 25 01:32:18.612146 containerd[1528]: time="2025-03-25T01:32:18.608455643Z" level=info msg="loading plugin" 
id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 25 01:32:18.612146 containerd[1528]: time="2025-03-25T01:32:18.608782097Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 25 01:32:18.612146 containerd[1528]: time="2025-03-25T01:32:18.608885395Z" level=info msg="metadata content store policy set" policy=shared Mar 25 01:32:18.622834 containerd[1528]: time="2025-03-25T01:32:18.622785200Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 25 01:32:18.623048 containerd[1528]: time="2025-03-25T01:32:18.623021388Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 25 01:32:18.623864 containerd[1528]: time="2025-03-25T01:32:18.623823119Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 25 01:32:18.624298 containerd[1528]: time="2025-03-25T01:32:18.624264862Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626137724Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626172308Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626215412Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626237839Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626286608Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626329184Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626370988Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626394839Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626809175Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626859525Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626894782Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626914805Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626932180Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 25 01:32:18.627759 containerd[1528]: time="2025-03-25T01:32:18.626948626Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 25 01:32:18.628315 containerd[1528]: time="2025-03-25T01:32:18.626967478Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 25 01:32:18.628315 containerd[1528]: time="2025-03-25T01:32:18.626990095Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 25 01:32:18.628315 containerd[1528]: time="2025-03-25T01:32:18.627018213Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 25 01:32:18.628315 containerd[1528]: time="2025-03-25T01:32:18.627040150Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 25 01:32:18.628315 containerd[1528]: time="2025-03-25T01:32:18.627059323Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 25 01:32:18.628315 containerd[1528]: time="2025-03-25T01:32:18.627196468Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 25 01:32:18.628315 containerd[1528]: time="2025-03-25T01:32:18.627243525Z" level=info msg="Start snapshots syncer" Mar 25 01:32:18.628315 containerd[1528]: time="2025-03-25T01:32:18.627290868Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 25 01:32:18.630551 polkitd[1569]: Started polkitd version 121 Mar 25 01:32:18.634289 containerd[1528]: time="2025-03-25T01:32:18.633283563Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 25 01:32:18.634289 containerd[1528]: time="2025-03-25T01:32:18.633392333Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 25 01:32:18.634556 
containerd[1528]: time="2025-03-25T01:32:18.633647996Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 25 01:32:18.634556 containerd[1528]: time="2025-03-25T01:32:18.633930173Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 25 01:32:18.634556 containerd[1528]: time="2025-03-25T01:32:18.634177966Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 25 01:32:18.634556 containerd[1528]: time="2025-03-25T01:32:18.634199967Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 25 01:32:18.634556 containerd[1528]: time="2025-03-25T01:32:18.634218064Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 25 01:32:18.634556 containerd[1528]: time="2025-03-25T01:32:18.634245267Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 25 01:32:18.635727 containerd[1528]: time="2025-03-25T01:32:18.634269389Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 25 01:32:18.635727 containerd[1528]: time="2025-03-25T01:32:18.635495001Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 25 01:32:18.635727 containerd[1528]: time="2025-03-25T01:32:18.635544273Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 25 01:32:18.635727 containerd[1528]: time="2025-03-25T01:32:18.635569191Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 25 01:32:18.635727 containerd[1528]: time="2025-03-25T01:32:18.635586250Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 25 01:32:18.635727 containerd[1528]: time="2025-03-25T01:32:18.635641551Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:32:18.636464 containerd[1528]: time="2025-03-25T01:32:18.635986464Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:32:18.636464 containerd[1528]: time="2025-03-25T01:32:18.636011070Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:32:18.636464 containerd[1528]: time="2025-03-25T01:32:18.636031386Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:32:18.636464 containerd[1528]: time="2025-03-25T01:32:18.636046577Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 25 01:32:18.636464 containerd[1528]: time="2025-03-25T01:32:18.636078615Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 25 01:32:18.636464 containerd[1528]: time="2025-03-25T01:32:18.636099786Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 25 01:32:18.636464 containerd[1528]: time="2025-03-25T01:32:18.636148521Z" level=info msg="runtime interface created" Mar 25 01:32:18.636464 containerd[1528]: time="2025-03-25T01:32:18.636166108Z" level=info msg="created NRI 
interface" Mar 25 01:32:18.636464 containerd[1528]: time="2025-03-25T01:32:18.636190139Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 25 01:32:18.636464 containerd[1528]: time="2025-03-25T01:32:18.636215827Z" level=info msg="Connect containerd service" Mar 25 01:32:18.636464 containerd[1528]: time="2025-03-25T01:32:18.636265148Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 25 01:32:18.643314 containerd[1528]: time="2025-03-25T01:32:18.641165643Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:32:18.650288 polkitd[1569]: Loading rules from directory /etc/polkit-1/rules.d Mar 25 01:32:18.656260 polkitd[1569]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 25 01:32:18.836085 sshd_keygen[1525]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 25 01:32:18.859714 containerd[1528]: time="2025-03-25T01:32:18.859605875Z" level=info msg="Start subscribing containerd event" Mar 25 01:32:18.859958 containerd[1528]: time="2025-03-25T01:32:18.859900479Z" level=info msg="Start recovering state" Mar 25 01:32:18.860203 containerd[1528]: time="2025-03-25T01:32:18.860179475Z" level=info msg="Start event monitor" Mar 25 01:32:18.860318 containerd[1528]: time="2025-03-25T01:32:18.860296656Z" level=info msg="Start cni network conf syncer for default" Mar 25 01:32:18.861498 containerd[1528]: time="2025-03-25T01:32:18.861473390Z" level=info msg="Start streaming server" Mar 25 01:32:18.861634 containerd[1528]: time="2025-03-25T01:32:18.861609387Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 25 01:32:18.861746 containerd[1528]: time="2025-03-25T01:32:18.861726300Z" level=info msg="runtime interface starting up..." Mar 25 01:32:18.864743 containerd[1528]: time="2025-03-25T01:32:18.861810530Z" level=info msg="starting plugins..." Mar 25 01:32:18.864743 containerd[1528]: time="2025-03-25T01:32:18.861866642Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 25 01:32:18.864743 containerd[1528]: time="2025-03-25T01:32:18.860955624Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 25 01:32:18.864743 containerd[1528]: time="2025-03-25T01:32:18.862110324Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 25 01:32:18.862333 systemd[1]: Started containerd.service - containerd container runtime. Mar 25 01:32:18.866842 containerd[1528]: time="2025-03-25T01:32:18.865473103Z" level=info msg="containerd successfully booted in 0.330859s" Mar 25 01:32:18.885025 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 25 01:32:18.891951 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 25 01:32:18.897629 systemd[1]: Started sshd@0-10.243.75.178:22-139.178.68.195:42042.service - OpenSSH per-connection server daemon (139.178.68.195:42042). Mar 25 01:32:18.915936 systemd[1]: issuegen.service: Deactivated successfully. Mar 25 01:32:18.916226 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 25 01:32:18.924751 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 25 01:32:18.964311 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 25 01:32:18.969000 systemd[1]: Started getty@tty1.service - Getty on tty1. 
Mar 25 01:32:18.971855 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 25 01:32:18.974269 systemd[1]: Reached target getty.target - Login Prompts. Mar 25 01:32:19.036935 tar[1511]: linux-amd64/LICENSE Mar 25 01:32:19.037488 tar[1511]: linux-amd64/README.md Mar 25 01:32:19.046731 systemd-networkd[1432]: eth0: Gained IPv6LL Mar 25 01:32:19.052333 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 25 01:32:19.054238 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 25 01:32:19.056415 systemd[1]: Reached target network-online.target - Network is Online. Mar 25 01:32:19.059726 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:32:19.062021 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 25 01:32:19.098535 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 25 01:32:19.831593 sshd[1600]: Accepted publickey for core from 139.178.68.195 port 42042 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:32:19.834043 sshd-session[1600]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:19.846713 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 25 01:32:19.849251 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 25 01:32:19.865736 systemd-logind[1504]: New session 1 of user core. Mar 25 01:32:19.883838 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 25 01:32:19.889046 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 25 01:32:19.907364 (systemd)[1627]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 25 01:32:19.913534 systemd-logind[1504]: New session c1 of user core. Mar 25 01:32:19.979599 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:32:19.991109 (kubelet)[1637]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:32:20.100609 systemd[1627]: Queued start job for default target default.target. Mar 25 01:32:20.107734 systemd[1627]: Created slice app.slice - User Application Slice. Mar 25 01:32:20.107787 systemd[1627]: Reached target paths.target - Paths. Mar 25 01:32:20.107892 systemd[1627]: Reached target timers.target - Timers. Mar 25 01:32:20.110290 systemd[1627]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 25 01:32:20.138471 systemd[1627]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 25 01:32:20.138721 systemd[1627]: Reached target sockets.target - Sockets. Mar 25 01:32:20.138800 systemd[1627]: Reached target basic.target - Basic System. Mar 25 01:32:20.138894 systemd[1627]: Reached target default.target - Main User Target. Mar 25 01:32:20.138976 systemd[1627]: Startup finished in 215ms. Mar 25 01:32:20.139225 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 25 01:32:20.149735 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 25 01:32:20.555598 systemd-networkd[1432]: eth0: Ignoring DHCPv6 address 2a02:1348:17c:d2ec:24:19ff:fef3:4bb2/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:17c:d2ec:24:19ff:fef3:4bb2/64 assigned by NDisc. Mar 25 01:32:20.556074 systemd-networkd[1432]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. 
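Note: sshd logs the accepted credential as "RSA SHA256:sItDUi79…", the OpenSSH-style SHA256 fingerprint of the client's public key. The same fingerprint form can be reproduced from the authorized_keys file that update-ssh-keys reported writing earlier; this sketch assumes the golang.org/x/crypto/ssh module is available and only reads the first key in the file:

    package main

    import (
        "fmt"
        "os"

        "golang.org/x/crypto/ssh"
    )

    func main() {
        data, err := os.ReadFile("/home/core/.ssh/authorized_keys")
        if err != nil {
            fmt.Println("read authorized_keys:", err)
            return
        }
        // ParseAuthorizedKey decodes the first key entry in the data.
        pub, _, _, _, err := ssh.ParseAuthorizedKey(data)
        if err != nil {
            fmt.Println("parse key:", err)
            return
        }
        // Prints e.g. "SHA256:..." in the same format sshd logs it.
        fmt.Println(ssh.FingerprintSHA256(pub))
    }
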
Mar 25 01:32:20.656526 kubelet[1637]: E0325 01:32:20.656402 1637 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:32:20.659487 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:32:20.659987 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:32:20.660711 systemd[1]: kubelet.service: Consumed 984ms CPU time, 243M memory peak. Mar 25 01:32:20.783200 systemd[1]: Started sshd@1-10.243.75.178:22-139.178.68.195:39660.service - OpenSSH per-connection server daemon (139.178.68.195:39660). Mar 25 01:32:21.685277 sshd[1651]: Accepted publickey for core from 139.178.68.195 port 39660 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:32:21.687296 sshd-session[1651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:21.694057 systemd-logind[1504]: New session 2 of user core. Mar 25 01:32:21.702730 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 25 01:32:22.304367 sshd[1655]: Connection closed by 139.178.68.195 port 39660 Mar 25 01:32:22.305213 sshd-session[1651]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:22.309571 systemd-logind[1504]: Session 2 logged out. Waiting for processes to exit. Mar 25 01:32:22.311059 systemd[1]: sshd@1-10.243.75.178:22-139.178.68.195:39660.service: Deactivated successfully. Mar 25 01:32:22.316733 systemd[1]: session-2.scope: Deactivated successfully. Mar 25 01:32:22.318267 systemd-logind[1504]: Removed session 2. Mar 25 01:32:22.462060 systemd[1]: Started sshd@2-10.243.75.178:22-139.178.68.195:39662.service - OpenSSH per-connection server daemon (139.178.68.195:39662). Mar 25 01:32:23.361552 sshd[1661]: Accepted publickey for core from 139.178.68.195 port 39662 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:32:23.363863 sshd-session[1661]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:23.370042 systemd-logind[1504]: New session 3 of user core. Mar 25 01:32:23.378692 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 25 01:32:23.981731 sshd[1663]: Connection closed by 139.178.68.195 port 39662 Mar 25 01:32:23.981592 sshd-session[1661]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:23.986662 systemd[1]: sshd@2-10.243.75.178:22-139.178.68.195:39662.service: Deactivated successfully. Mar 25 01:32:23.989873 systemd[1]: session-3.scope: Deactivated successfully. Mar 25 01:32:23.991400 systemd-logind[1504]: Session 3 logged out. Waiting for processes to exit. Mar 25 01:32:23.994100 systemd-logind[1504]: Removed session 3. Mar 25 01:32:24.058882 login[1607]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 25 01:32:24.062305 login[1608]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 25 01:32:24.068077 systemd-logind[1504]: New session 4 of user core. Mar 25 01:32:24.075788 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 25 01:32:24.079862 systemd-logind[1504]: New session 5 of user core. Mar 25 01:32:24.084756 systemd[1]: Started session-5.scope - Session 5 of User core. 
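Note: kubelet exits with status 1 because /var/lib/kubelet/config.yaml does not exist yet, and systemd keeps scheduling restarts (the restart counter climbs to 1, 2 and 3 later in this log) until something provisions that file. A purely illustrative watcher for that condition (an assumption for demonstration, not part of Flatcar's or kubeadm's tooling):

    package main

    import (
        "fmt"
        "os"
        "time"
    )

    func main() {
        const path = "/var/lib/kubelet/config.yaml"
        for {
            if _, err := os.Stat(path); err == nil {
                fmt.Println("kubelet config present:", path)
                return
            }
            fmt.Println("waiting for", path, "- kubelet will keep crash-looping until it exists")
            time.Sleep(10 * time.Second)
        }
    }
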
Mar 25 01:32:24.938282 coreos-metadata[1494]: Mar 25 01:32:24.938 WARN failed to locate config-drive, using the metadata service API instead Mar 25 01:32:24.965286 coreos-metadata[1494]: Mar 25 01:32:24.965 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Mar 25 01:32:24.971204 coreos-metadata[1494]: Mar 25 01:32:24.971 INFO Fetch failed with 404: resource not found Mar 25 01:32:24.971268 coreos-metadata[1494]: Mar 25 01:32:24.971 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 25 01:32:24.971931 coreos-metadata[1494]: Mar 25 01:32:24.971 INFO Fetch successful Mar 25 01:32:24.972093 coreos-metadata[1494]: Mar 25 01:32:24.972 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Mar 25 01:32:24.985533 coreos-metadata[1494]: Mar 25 01:32:24.985 INFO Fetch successful Mar 25 01:32:24.985651 coreos-metadata[1494]: Mar 25 01:32:24.985 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Mar 25 01:32:24.999514 coreos-metadata[1494]: Mar 25 01:32:24.999 INFO Fetch successful Mar 25 01:32:24.999648 coreos-metadata[1494]: Mar 25 01:32:24.999 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Mar 25 01:32:25.019573 coreos-metadata[1494]: Mar 25 01:32:25.019 INFO Fetch successful Mar 25 01:32:25.019754 coreos-metadata[1494]: Mar 25 01:32:25.019 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Mar 25 01:32:25.036450 coreos-metadata[1494]: Mar 25 01:32:25.036 INFO Fetch successful Mar 25 01:32:25.066353 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 25 01:32:25.068301 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 25 01:32:25.665924 coreos-metadata[1567]: Mar 25 01:32:25.665 WARN failed to locate config-drive, using the metadata service API instead Mar 25 01:32:25.687482 coreos-metadata[1567]: Mar 25 01:32:25.687 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 25 01:32:25.709703 coreos-metadata[1567]: Mar 25 01:32:25.709 INFO Fetch successful Mar 25 01:32:25.709886 coreos-metadata[1567]: Mar 25 01:32:25.709 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 25 01:32:25.741761 coreos-metadata[1567]: Mar 25 01:32:25.741 INFO Fetch successful Mar 25 01:32:25.743288 unknown[1567]: wrote ssh authorized keys file for user: core Mar 25 01:32:25.765421 update-ssh-keys[1704]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:32:25.766383 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 25 01:32:25.768751 systemd[1]: Finished sshkeys.service. Mar 25 01:32:25.772035 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 25 01:32:30.910364 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 25 01:32:30.912743 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:32:31.059085 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
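Note: coreos-metadata fails to locate a config drive, the OpenStack-specific /openstack/2012-08-10/meta_data.json path returns 404, and it falls back to fetching hostname, instance-id, instance-type, local-ipv4 and public-ipv4 from http://169.254.169.254/latest/meta-data/. A hedged sketch querying the same endpoints with net/http (URLs copied from the log; only works from inside such an instance):

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    func main() {
        client := &http.Client{Timeout: 5 * time.Second}
        base := "http://169.254.169.254/latest/meta-data/"
        for _, key := range []string{"hostname", "instance-id", "instance-type", "local-ipv4", "public-ipv4"} {
            resp, err := client.Get(base + key)
            if err != nil {
                fmt.Printf("%s: %v\n", key, err)
                continue
            }
            body, _ := io.ReadAll(resp.Body)
            resp.Body.Close()
            fmt.Printf("%s (%d): %s\n", key, resp.StatusCode, body)
        }
    }
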
Mar 25 01:32:31.080112 (kubelet)[1716]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:32:31.156219 kubelet[1716]: E0325 01:32:31.156057 1716 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:32:31.160377 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:32:31.160683 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:32:31.161315 systemd[1]: kubelet.service: Consumed 188ms CPU time, 96.2M memory peak. Mar 25 01:32:33.656823 polkitd[1569]: Terminating runaway script after 15 seconds Mar 25 01:32:33.657737 polkitd[1569]: Finished loading, compiling and executing 2 rules Mar 25 01:32:33.659287 dbus-daemon[1495]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 25 01:32:33.660242 polkitd[1569]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 25 01:32:33.659646 systemd[1]: Started polkit.service - Authorization Manager. Mar 25 01:32:33.660632 systemd[1]: Startup finished in 1.265s (kernel) + 23.199s (initrd) + 19.737s (userspace) = 44.203s. Mar 25 01:32:33.678259 systemd-hostnamed[1531]: Hostname set to (static) Mar 25 01:32:34.137675 systemd[1]: Started sshd@3-10.243.75.178:22-139.178.68.195:59670.service - OpenSSH per-connection server daemon (139.178.68.195:59670). Mar 25 01:32:35.049222 sshd[1729]: Accepted publickey for core from 139.178.68.195 port 59670 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:32:35.051095 sshd-session[1729]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:35.057415 systemd-logind[1504]: New session 6 of user core. Mar 25 01:32:35.068704 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 25 01:32:35.671983 sshd[1731]: Connection closed by 139.178.68.195 port 59670 Mar 25 01:32:35.672935 sshd-session[1729]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:35.677169 systemd-logind[1504]: Session 6 logged out. Waiting for processes to exit. Mar 25 01:32:35.678301 systemd[1]: sshd@3-10.243.75.178:22-139.178.68.195:59670.service: Deactivated successfully. Mar 25 01:32:35.680938 systemd[1]: session-6.scope: Deactivated successfully. Mar 25 01:32:35.682329 systemd-logind[1504]: Removed session 6. Mar 25 01:32:35.826576 systemd[1]: Started sshd@4-10.243.75.178:22-139.178.68.195:38436.service - OpenSSH per-connection server daemon (139.178.68.195:38436). Mar 25 01:32:36.731392 sshd[1737]: Accepted publickey for core from 139.178.68.195 port 38436 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:32:36.733341 sshd-session[1737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:36.740552 systemd-logind[1504]: New session 7 of user core. Mar 25 01:32:36.747679 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 25 01:32:37.345348 sshd[1739]: Connection closed by 139.178.68.195 port 38436 Mar 25 01:32:37.345212 sshd-session[1737]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:37.348996 systemd[1]: sshd@4-10.243.75.178:22-139.178.68.195:38436.service: Deactivated successfully. 
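Note: in the "Startup finished" entry above, the three phases sum to 1.265 + 23.199 + 19.737 = 44.201 s, consistent with the reported 44.203 s total once each per-phase value is independently rounded to milliseconds.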
Mar 25 01:32:37.351323 systemd[1]: session-7.scope: Deactivated successfully. Mar 25 01:32:37.352598 systemd-logind[1504]: Session 7 logged out. Waiting for processes to exit. Mar 25 01:32:37.354558 systemd-logind[1504]: Removed session 7. Mar 25 01:32:37.505760 systemd[1]: Started sshd@5-10.243.75.178:22-139.178.68.195:38448.service - OpenSSH per-connection server daemon (139.178.68.195:38448). Mar 25 01:32:38.412657 sshd[1745]: Accepted publickey for core from 139.178.68.195 port 38448 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:32:38.414647 sshd-session[1745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:38.423443 systemd-logind[1504]: New session 8 of user core. Mar 25 01:32:38.430717 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 25 01:32:39.036005 sshd[1747]: Connection closed by 139.178.68.195 port 38448 Mar 25 01:32:39.035786 sshd-session[1745]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:39.040769 systemd-logind[1504]: Session 8 logged out. Waiting for processes to exit. Mar 25 01:32:39.041895 systemd[1]: sshd@5-10.243.75.178:22-139.178.68.195:38448.service: Deactivated successfully. Mar 25 01:32:39.043881 systemd[1]: session-8.scope: Deactivated successfully. Mar 25 01:32:39.045164 systemd-logind[1504]: Removed session 8. Mar 25 01:32:39.190332 systemd[1]: Started sshd@6-10.243.75.178:22-139.178.68.195:38456.service - OpenSSH per-connection server daemon (139.178.68.195:38456). Mar 25 01:32:40.099250 sshd[1753]: Accepted publickey for core from 139.178.68.195 port 38456 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:32:40.101071 sshd-session[1753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:40.108524 systemd-logind[1504]: New session 9 of user core. Mar 25 01:32:40.115635 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 25 01:32:40.591292 sudo[1756]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 25 01:32:40.592656 sudo[1756]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:32:40.611677 sudo[1756]: pam_unix(sudo:session): session closed for user root Mar 25 01:32:40.755819 sshd[1755]: Connection closed by 139.178.68.195 port 38456 Mar 25 01:32:40.756908 sshd-session[1753]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:40.760855 systemd[1]: sshd@6-10.243.75.178:22-139.178.68.195:38456.service: Deactivated successfully. Mar 25 01:32:40.763782 systemd[1]: session-9.scope: Deactivated successfully. Mar 25 01:32:40.765857 systemd-logind[1504]: Session 9 logged out. Waiting for processes to exit. Mar 25 01:32:40.767370 systemd-logind[1504]: Removed session 9. Mar 25 01:32:40.910761 systemd[1]: Started sshd@7-10.243.75.178:22-139.178.68.195:38468.service - OpenSSH per-connection server daemon (139.178.68.195:38468). Mar 25 01:32:41.411296 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 25 01:32:41.413784 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:32:41.573166 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 25 01:32:41.588087 (kubelet)[1772]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:32:41.673175 kubelet[1772]: E0325 01:32:41.673039 1772 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:32:41.675476 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:32:41.675712 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:32:41.676309 systemd[1]: kubelet.service: Consumed 182ms CPU time, 98.5M memory peak. Mar 25 01:32:41.817913 sshd[1762]: Accepted publickey for core from 139.178.68.195 port 38468 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:32:41.819801 sshd-session[1762]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:41.827667 systemd-logind[1504]: New session 10 of user core. Mar 25 01:32:41.835637 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 25 01:32:42.296219 sudo[1782]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 25 01:32:42.297238 sudo[1782]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:32:42.302793 sudo[1782]: pam_unix(sudo:session): session closed for user root Mar 25 01:32:42.310547 sudo[1781]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 25 01:32:42.310951 sudo[1781]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:32:42.325161 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:32:42.378669 augenrules[1804]: No rules Mar 25 01:32:42.380296 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:32:42.380725 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:32:42.382077 sudo[1781]: pam_unix(sudo:session): session closed for user root Mar 25 01:32:42.526036 sshd[1780]: Connection closed by 139.178.68.195 port 38468 Mar 25 01:32:42.526542 sshd-session[1762]: pam_unix(sshd:session): session closed for user core Mar 25 01:32:42.530873 systemd-logind[1504]: Session 10 logged out. Waiting for processes to exit. Mar 25 01:32:42.531350 systemd[1]: sshd@7-10.243.75.178:22-139.178.68.195:38468.service: Deactivated successfully. Mar 25 01:32:42.533590 systemd[1]: session-10.scope: Deactivated successfully. Mar 25 01:32:42.535666 systemd-logind[1504]: Removed session 10. Mar 25 01:32:42.680630 systemd[1]: Started sshd@8-10.243.75.178:22-139.178.68.195:38476.service - OpenSSH per-connection server daemon (139.178.68.195:38476). Mar 25 01:32:43.584831 sshd[1813]: Accepted publickey for core from 139.178.68.195 port 38476 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:32:43.586701 sshd-session[1813]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:32:43.594126 systemd-logind[1504]: New session 11 of user core. Mar 25 01:32:43.602756 systemd[1]: Started session-11.scope - Session 11 of User core. 
Mar 25 01:32:44.060995 sudo[1816]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 25 01:32:44.061444 sudo[1816]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:32:44.580721 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 25 01:32:44.595974 (dockerd)[1833]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 25 01:32:44.997364 dockerd[1833]: time="2025-03-25T01:32:44.997075669Z" level=info msg="Starting up" Mar 25 01:32:45.001989 dockerd[1833]: time="2025-03-25T01:32:45.001785763Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 25 01:32:45.042410 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2942917230-merged.mount: Deactivated successfully. Mar 25 01:32:45.074028 dockerd[1833]: time="2025-03-25T01:32:45.073967453Z" level=info msg="Loading containers: start." Mar 25 01:32:45.286558 kernel: Initializing XFRM netlink socket Mar 25 01:32:45.377810 systemd-networkd[1432]: docker0: Link UP Mar 25 01:32:45.448247 dockerd[1833]: time="2025-03-25T01:32:45.448152370Z" level=info msg="Loading containers: done." Mar 25 01:32:45.465987 dockerd[1833]: time="2025-03-25T01:32:45.465925958Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 25 01:32:45.466160 dockerd[1833]: time="2025-03-25T01:32:45.466049011Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 Mar 25 01:32:45.466282 dockerd[1833]: time="2025-03-25T01:32:45.466236823Z" level=info msg="Daemon has completed initialization" Mar 25 01:32:45.501899 dockerd[1833]: time="2025-03-25T01:32:45.501812122Z" level=info msg="API listen on /run/docker.sock" Mar 25 01:32:45.502386 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 25 01:32:46.037542 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1446736997-merged.mount: Deactivated successfully. Mar 25 01:32:46.734270 containerd[1528]: time="2025-03-25T01:32:46.734124654Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\"" Mar 25 01:32:47.666207 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount537725326.mount: Deactivated successfully. 
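Note: dockerd comes up on the overlay2 storage driver and reports "API listen on /run/docker.sock", matching the earlier docker.socket update from the legacy /var/run path. The Engine API on that unix socket exposes /_ping as a health endpoint; a sketch of hitting it with only the standard library (assumes the daemon is still running and the caller may open the socket):

    package main

    import (
        "context"
        "fmt"
        "io"
        "net"
        "net/http"
        "time"
    )

    func main() {
        // Route all HTTP requests over the Docker unix socket; the host in
        // the URL is ignored once DialContext is overridden.
        client := &http.Client{
            Timeout: 3 * time.Second,
            Transport: &http.Transport{
                DialContext: func(ctx context.Context, _, _ string) (net.Conn, error) {
                    var d net.Dialer
                    return d.DialContext(ctx, "unix", "/run/docker.sock")
                },
            },
        }
        resp, err := client.Get("http://docker/_ping")
        if err != nil {
            fmt.Println("ping failed:", err)
            return
        }
        defer resp.Body.Close()
        body, _ := io.ReadAll(resp.Body)
        fmt.Printf("Docker API ping: %d %s\n", resp.StatusCode, body)
    }
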
Mar 25 01:32:50.223244 containerd[1528]: time="2025-03-25T01:32:50.223063031Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:50.225299 containerd[1528]: time="2025-03-25T01:32:50.225121159Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=32674581" Mar 25 01:32:50.225299 containerd[1528]: time="2025-03-25T01:32:50.225225355Z" level=info msg="ImageCreate event name:\"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:50.228720 containerd[1528]: time="2025-03-25T01:32:50.228652348Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:50.230608 containerd[1528]: time="2025-03-25T01:32:50.230153674Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"32671373\" in 3.495867851s" Mar 25 01:32:50.230608 containerd[1528]: time="2025-03-25T01:32:50.230225087Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\"" Mar 25 01:32:50.258585 containerd[1528]: time="2025-03-25T01:32:50.258403133Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\"" Mar 25 01:32:51.920597 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 25 01:32:51.925645 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:32:52.306732 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:32:52.316135 (kubelet)[2118]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:32:52.426457 kubelet[2118]: E0325 01:32:52.425311 2118 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:32:52.429504 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:32:52.429919 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:32:52.430631 systemd[1]: kubelet.service: Consumed 203ms CPU time, 97.5M memory peak. 
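The restart at counter 3 fails for exactly the same reason as before: /var/lib/kubelet/config.yaml is still missing. For orientation, a minimal KubeletConfiguration of the kind that eventually lands at that path looks roughly like the following; the values are placeholders, except cgroupDriver, which matches the CgroupDriver shown in the node config dump later in this log:

    cat /var/lib/kubelet/config.yaml
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    staticPodPath: /etc/kubernetes/manifests
    clusterDomain: cluster.local
    clusterDNS:
      - 10.96.0.10        # placeholder cluster DNS service IP
    rotateCertificates: true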
Mar 25 01:32:53.399579 containerd[1528]: time="2025-03-25T01:32:53.398396194Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:53.401314 containerd[1528]: time="2025-03-25T01:32:53.401196776Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=29619780" Mar 25 01:32:53.402068 containerd[1528]: time="2025-03-25T01:32:53.401989126Z" level=info msg="ImageCreate event name:\"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:53.405128 containerd[1528]: time="2025-03-25T01:32:53.405029373Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:53.407157 containerd[1528]: time="2025-03-25T01:32:53.406474565Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"31107380\" in 3.147629644s" Mar 25 01:32:53.407157 containerd[1528]: time="2025-03-25T01:32:53.406552402Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\"" Mar 25 01:32:53.433205 containerd[1528]: time="2025-03-25T01:32:53.432946192Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\"" Mar 25 01:32:55.320401 containerd[1528]: time="2025-03-25T01:32:55.320296697Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:55.321902 containerd[1528]: time="2025-03-25T01:32:55.321835357Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=17903317" Mar 25 01:32:55.322802 containerd[1528]: time="2025-03-25T01:32:55.322747671Z" level=info msg="ImageCreate event name:\"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:55.325869 containerd[1528]: time="2025-03-25T01:32:55.325838510Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:55.327401 containerd[1528]: time="2025-03-25T01:32:55.327217326Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"19390935\" in 1.89362073s" Mar 25 01:32:55.327401 containerd[1528]: time="2025-03-25T01:32:55.327259404Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\"" Mar 25 01:32:55.353165 
containerd[1528]: time="2025-03-25T01:32:55.353118971Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\"" Mar 25 01:32:55.945375 systemd[1]: Started sshd@9-10.243.75.178:22-218.92.0.228:16658.service - OpenSSH per-connection server daemon (218.92.0.228:16658). Mar 25 01:32:56.932061 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount575048695.mount: Deactivated successfully. Mar 25 01:32:57.576229 containerd[1528]: time="2025-03-25T01:32:57.576107941Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:57.578003 containerd[1528]: time="2025-03-25T01:32:57.577892541Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=29185380" Mar 25 01:32:57.579186 containerd[1528]: time="2025-03-25T01:32:57.579107375Z" level=info msg="ImageCreate event name:\"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:57.581550 containerd[1528]: time="2025-03-25T01:32:57.581484423Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:57.583309 containerd[1528]: time="2025-03-25T01:32:57.582736269Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"29184391\" in 2.22955893s" Mar 25 01:32:57.583309 containerd[1528]: time="2025-03-25T01:32:57.582808867Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\"" Mar 25 01:32:57.608195 containerd[1528]: time="2025-03-25T01:32:57.608074611Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 25 01:32:58.183656 sshd-session[2162]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.228 user=root Mar 25 01:32:58.214119 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2375133737.mount: Deactivated successfully. 
Mar 25 01:32:59.650376 containerd[1528]: time="2025-03-25T01:32:59.650311874Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:59.654974 containerd[1528]: time="2025-03-25T01:32:59.654909836Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769" Mar 25 01:32:59.656628 containerd[1528]: time="2025-03-25T01:32:59.656563215Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:59.660129 containerd[1528]: time="2025-03-25T01:32:59.659505750Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:32:59.660817 containerd[1528]: time="2025-03-25T01:32:59.660779662Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.052514456s" Mar 25 01:32:59.660898 containerd[1528]: time="2025-03-25T01:32:59.660820023Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Mar 25 01:32:59.691001 containerd[1528]: time="2025-03-25T01:32:59.690925431Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Mar 25 01:32:59.814504 sshd[2145]: PAM: Permission denied for root from 218.92.0.228 Mar 25 01:33:00.263276 sshd-session[2213]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.228 user=root Mar 25 01:33:00.281668 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount571052393.mount: Deactivated successfully. 
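containerd has now pulled kube-apiserver, kube-controller-manager, kube-scheduler, kube-proxy and coredns, with pause:3.9 in flight. If the pulls recorded above needed cross-checking against the image store, the usual tools are the CRI and containerd CLIs (illustrative; crictl assumes its runtime endpoint is configured):

    # CRI view of the image store
    crictl images
    # containerd's k8s.io namespace directly
    ctr -n k8s.io images ls | grep registry.k8s.io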
Mar 25 01:33:00.285983 containerd[1528]: time="2025-03-25T01:33:00.285916631Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:00.287463 containerd[1528]: time="2025-03-25T01:33:00.287329475Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298" Mar 25 01:33:00.288434 containerd[1528]: time="2025-03-25T01:33:00.288380040Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:00.291126 containerd[1528]: time="2025-03-25T01:33:00.291057702Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:00.293758 containerd[1528]: time="2025-03-25T01:33:00.292079433Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 600.706551ms" Mar 25 01:33:00.293758 containerd[1528]: time="2025-03-25T01:33:00.292123778Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Mar 25 01:33:00.319757 containerd[1528]: time="2025-03-25T01:33:00.319539183Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Mar 25 01:33:00.946327 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1125462345.mount: Deactivated successfully. Mar 25 01:33:02.639957 sshd[2145]: PAM: Permission denied for root from 218.92.0.228 Mar 25 01:33:02.668740 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 25 01:33:02.673053 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:33:03.090219 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:33:03.102939 (kubelet)[2273]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:33:03.179173 kubelet[2273]: E0325 01:33:03.179047 2273 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:33:03.182055 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:33:03.182344 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:33:03.183240 systemd[1]: kubelet.service: Consumed 233ms CPU time, 97M memory peak. Mar 25 01:33:03.459959 update_engine[1505]: I20250325 01:33:03.459678 1505 update_attempter.cc:509] Updating boot flags... 
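"Scheduled restart job, restart counter is at 4" is systemd's Restart= policy re-queuing kubelet.service after each config-file failure. The effective policy, delay and accumulated restart count can be read back from the unit (diagnostic sketch, nothing host-specific):

    systemctl show kubelet.service -p Restart -p RestartUSec -p NRestarts
    # replay the most recent failure in context
    journalctl -u kubelet.service -n 30 --no-pager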
Mar 25 01:33:03.517480 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2289) Mar 25 01:33:03.536826 sshd-session[2281]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.228 user=root Mar 25 01:33:03.632513 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2292) Mar 25 01:33:03.685295 systemd[1]: Started sshd@10-10.243.75.178:22-218.92.0.221:46308.service - OpenSSH per-connection server daemon (218.92.0.221:46308). Mar 25 01:33:03.720826 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Mar 25 01:33:05.658613 sshd[2145]: PAM: Permission denied for root from 218.92.0.228 Mar 25 01:33:05.873549 sshd[2145]: Received disconnect from 218.92.0.228 port 16658:11: [preauth] Mar 25 01:33:05.873549 sshd[2145]: Disconnected from authenticating user root 218.92.0.228 port 16658 [preauth] Mar 25 01:33:05.878822 systemd[1]: sshd@9-10.243.75.178:22-218.92.0.228:16658.service: Deactivated successfully. Mar 25 01:33:05.981525 containerd[1528]: time="2025-03-25T01:33:05.981405763Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:05.983128 containerd[1528]: time="2025-03-25T01:33:05.983031542Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238579" Mar 25 01:33:05.983983 containerd[1528]: time="2025-03-25T01:33:05.983814046Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:05.996347 containerd[1528]: time="2025-03-25T01:33:05.996293259Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:06.002639 containerd[1528]: time="2025-03-25T01:33:06.000358294Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 5.680758852s" Mar 25 01:33:06.002639 containerd[1528]: time="2025-03-25T01:33:06.000421031Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" Mar 25 01:33:06.079048 systemd[1]: Started sshd@11-10.243.75.178:22-218.92.0.228:33582.service - OpenSSH per-connection server daemon (218.92.0.228:33582). Mar 25 01:33:06.183199 sshd-session[2312]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.221 user=root Mar 25 01:33:07.912777 sshd[2297]: PAM: Permission denied for root from 218.92.0.221 Mar 25 01:33:08.300211 sshd-session[2390]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.228 user=root Mar 25 01:33:08.772280 sshd-session[2391]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.221 user=root Mar 25 01:33:09.658380 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:33:09.658697 systemd[1]: kubelet.service: Consumed 233ms CPU time, 97M memory peak. 
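Alongside the bootstrap, sshd is absorbing a steady stream of root password attempts from 218.92.0.228 and 218.92.0.221; each one ends in "PAM: Permission denied", while only the key-authenticated core sessions succeed. A generic check for this key-only posture (a sketch, not this host's actual configuration):

    # confirm root login and password authentication are both off in the effective config
    sshd -T | grep -E '^(permitrootlogin|passwordauthentication)'
    # expected for a key-only setup:
    #   permitrootlogin no
    #   passwordauthentication no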
Mar 25 01:33:09.662776 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:33:09.691984 systemd[1]: Reload requested from client PID 2398 ('systemctl') (unit session-11.scope)... Mar 25 01:33:09.692040 systemd[1]: Reloading... Mar 25 01:33:09.880717 zram_generator::config[2448]: No configuration found. Mar 25 01:33:10.043685 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:33:10.195373 systemd[1]: Reloading finished in 502 ms. Mar 25 01:33:10.273990 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:33:10.280080 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:33:10.281159 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:33:10.281638 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:33:10.281718 systemd[1]: kubelet.service: Consumed 136ms CPU time, 83.5M memory peak. Mar 25 01:33:10.284832 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:33:10.432299 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:33:10.441776 sshd[2320]: PAM: Permission denied for root from 218.92.0.228 Mar 25 01:33:10.445099 (kubelet)[2517]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:33:10.551600 kubelet[2517]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:33:10.551600 kubelet[2517]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 25 01:33:10.551600 kubelet[2517]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
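The reload surfaces a deprecation note: docker.socket still sets ListenStream= to a path under /var/run/, and systemd rewrites it to /run/docker.sock on the fly while asking for the unit file to be updated. The usual way to make that permanent is a socket drop-in (sketch; the empty ListenStream= clears the inherited value before the new one is set):

    cat /etc/systemd/system/docker.socket.d/override.conf
    [Socket]
    ListenStream=
    ListenStream=/run/docker.sock
    systemctl daemon-reload && systemctl restart docker.socket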
Mar 25 01:33:10.552983 kubelet[2517]: I0325 01:33:10.552836 2517 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:33:10.828558 sshd-session[2525]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.228 user=root Mar 25 01:33:10.914988 sshd[2297]: PAM: Permission denied for root from 218.92.0.221 Mar 25 01:33:11.323044 sshd-session[2526]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.221 user=root Mar 25 01:33:11.329295 kubelet[2517]: I0325 01:33:11.329235 2517 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 25 01:33:11.329295 kubelet[2517]: I0325 01:33:11.329272 2517 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:33:11.329591 kubelet[2517]: I0325 01:33:11.329561 2517 server.go:927] "Client rotation is on, will bootstrap in background" Mar 25 01:33:11.351231 kubelet[2517]: I0325 01:33:11.350700 2517 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:33:11.353646 kubelet[2517]: E0325 01:33:11.353249 2517 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.243.75.178:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:11.373791 kubelet[2517]: I0325 01:33:11.373759 2517 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 25 01:33:11.378639 kubelet[2517]: I0325 01:33:11.378191 2517 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:33:11.378639 kubelet[2517]: I0325 01:33:11.378259 2517 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"srv-y0b1r.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 25 01:33:11.379379 kubelet[2517]: I0325 01:33:11.379353 2517 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:33:11.380008 kubelet[2517]: I0325 01:33:11.379589 2517 container_manager_linux.go:301] "Creating device plugin manager" Mar 25 01:33:11.380008 kubelet[2517]: I0325 01:33:11.379800 2517 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:33:11.381132 kubelet[2517]: I0325 01:33:11.380754 2517 kubelet.go:400] "Attempting to sync node with API server" Mar 25 01:33:11.381132 kubelet[2517]: I0325 01:33:11.380788 2517 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:33:11.381132 kubelet[2517]: I0325 01:33:11.380841 2517 kubelet.go:312] "Adding apiserver pod source" Mar 25 01:33:11.381132 kubelet[2517]: I0325 01:33:11.380878 2517 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:33:11.384247 kubelet[2517]: W0325 01:33:11.384158 2517 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.243.75.178:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-y0b1r.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:11.385317 kubelet[2517]: E0325 01:33:11.384358 2517 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.243.75.178:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-y0b1r.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:11.385317 kubelet[2517]: W0325 01:33:11.384818 2517 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.243.75.178:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:11.385317 kubelet[2517]: E0325 01:33:11.384884 2517 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.Service: failed to list *v1.Service: Get "https://10.243.75.178:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:11.385317 kubelet[2517]: I0325 01:33:11.385015 2517 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:33:11.388229 kubelet[2517]: I0325 01:33:11.386912 2517 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:33:11.388229 kubelet[2517]: W0325 01:33:11.387068 2517 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 25 01:33:11.389341 kubelet[2517]: I0325 01:33:11.388577 2517 server.go:1264] "Started kubelet" Mar 25 01:33:11.416720 kubelet[2517]: I0325 01:33:11.415500 2517 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:33:11.416720 kubelet[2517]: I0325 01:33:11.416517 2517 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:33:11.416993 kubelet[2517]: I0325 01:33:11.416894 2517 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:33:11.417383 kubelet[2517]: I0325 01:33:11.417356 2517 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:33:11.417777 kubelet[2517]: I0325 01:33:11.417752 2517 server.go:455] "Adding debug handlers to kubelet server" Mar 25 01:33:11.421811 kubelet[2517]: I0325 01:33:11.421791 2517 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 25 01:33:11.422337 kubelet[2517]: E0325 01:33:11.421723 2517 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.243.75.178:6443/api/v1/namespaces/default/events\": dial tcp 10.243.75.178:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-y0b1r.gb1.brightbox.com.182fe7bfb5ae445e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-y0b1r.gb1.brightbox.com,UID:srv-y0b1r.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-y0b1r.gb1.brightbox.com,},FirstTimestamp:2025-03-25 01:33:11.38853795 +0000 UTC m=+0.937987664,LastTimestamp:2025-03-25 01:33:11.38853795 +0000 UTC m=+0.937987664,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-y0b1r.gb1.brightbox.com,}" Mar 25 01:33:11.434007 kubelet[2517]: I0325 01:33:11.433962 2517 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:33:11.434214 kubelet[2517]: I0325 01:33:11.434192 2517 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:33:11.434494 kubelet[2517]: E0325 01:33:11.434446 2517 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.75.178:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-y0b1r.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.75.178:6443: connect: connection refused" interval="200ms" Mar 25 01:33:11.434647 kubelet[2517]: W0325 01:33:11.434595 2517 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.243.75.178:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused 
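Every client-go informer and the event writer are failing with "dial tcp 10.243.75.178:6443: connect: connection refused", which is expected at this point: this kubelet is the component that will launch kube-apiserver as a static pod, so nothing listens on 6443 yet. If the refusals persisted after the control-plane containers come up, the obvious checks would be (diagnostic sketch):

    # anything listening on the API server port?
    ss -tlnp | grep 6443
    # is the static kube-apiserver container running under the CRI?
    crictl ps -a --name kube-apiserver
    # does the endpoint answer at all (certificate verification skipped)?
    curl -k https://10.243.75.178:6443/healthz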
Mar 25 01:33:11.434716 kubelet[2517]: E0325 01:33:11.434654 2517 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.243.75.178:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:11.437787 kubelet[2517]: I0325 01:33:11.437758 2517 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:33:11.437998 kubelet[2517]: I0325 01:33:11.437863 2517 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:33:11.439925 kubelet[2517]: I0325 01:33:11.439896 2517 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:33:11.458237 kubelet[2517]: E0325 01:33:11.457938 2517 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:33:11.472371 kubelet[2517]: I0325 01:33:11.472287 2517 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:33:11.478456 kubelet[2517]: I0325 01:33:11.478395 2517 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:33:11.478719 kubelet[2517]: I0325 01:33:11.478586 2517 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 25 01:33:11.478719 kubelet[2517]: I0325 01:33:11.478654 2517 kubelet.go:2337] "Starting kubelet main sync loop" Mar 25 01:33:11.479180 kubelet[2517]: E0325 01:33:11.479138 2517 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:33:11.480771 kubelet[2517]: W0325 01:33:11.480708 2517 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.243.75.178:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:11.480956 kubelet[2517]: E0325 01:33:11.480877 2517 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.243.75.178:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:11.482187 kubelet[2517]: I0325 01:33:11.482163 2517 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 25 01:33:11.482187 kubelet[2517]: I0325 01:33:11.482186 2517 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 25 01:33:11.482317 kubelet[2517]: I0325 01:33:11.482211 2517 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:33:11.484502 kubelet[2517]: I0325 01:33:11.484449 2517 policy_none.go:49] "None policy: Start" Mar 25 01:33:11.485495 kubelet[2517]: I0325 01:33:11.485272 2517 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 25 01:33:11.485495 kubelet[2517]: I0325 01:33:11.485316 2517 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:33:11.496093 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 25 01:33:11.511488 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Mar 25 01:33:11.516393 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 25 01:33:11.525097 kubelet[2517]: I0325 01:33:11.525053 2517 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:33:11.525379 kubelet[2517]: I0325 01:33:11.525327 2517 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:33:11.525661 kubelet[2517]: I0325 01:33:11.525556 2517 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:33:11.525661 kubelet[2517]: I0325 01:33:11.525608 2517 kubelet_node_status.go:73] "Attempting to register node" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.526555 kubelet[2517]: E0325 01:33:11.526248 2517 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.243.75.178:6443/api/v1/nodes\": dial tcp 10.243.75.178:6443: connect: connection refused" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.529449 kubelet[2517]: E0325 01:33:11.529329 2517 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-y0b1r.gb1.brightbox.com\" not found" Mar 25 01:33:11.581172 kubelet[2517]: I0325 01:33:11.579695 2517 topology_manager.go:215] "Topology Admit Handler" podUID="0982b4988025caeb91f95b38306a6938" podNamespace="kube-system" podName="kube-apiserver-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.582360 kubelet[2517]: I0325 01:33:11.582209 2517 topology_manager.go:215] "Topology Admit Handler" podUID="7f4b9fe202ba7f7a3ab1ae3d4cd08969" podNamespace="kube-system" podName="kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.585261 kubelet[2517]: I0325 01:33:11.585220 2517 topology_manager.go:215] "Topology Admit Handler" podUID="fb9eb9823e97d6c912ef3c7d0b97e2d8" podNamespace="kube-system" podName="kube-scheduler-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.595541 systemd[1]: Created slice kubepods-burstable-pod0982b4988025caeb91f95b38306a6938.slice - libcontainer container kubepods-burstable-pod0982b4988025caeb91f95b38306a6938.slice. Mar 25 01:33:11.609566 systemd[1]: Created slice kubepods-burstable-pod7f4b9fe202ba7f7a3ab1ae3d4cd08969.slice - libcontainer container kubepods-burstable-pod7f4b9fe202ba7f7a3ab1ae3d4cd08969.slice. Mar 25 01:33:11.615703 systemd[1]: Created slice kubepods-burstable-podfb9eb9823e97d6c912ef3c7d0b97e2d8.slice - libcontainer container kubepods-burstable-podfb9eb9823e97d6c912ef3c7d0b97e2d8.slice. 
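The three "Topology Admit Handler" entries and the per-pod kubepods-burstable-pod<uid>.slice units are the static control-plane pods being admitted from the manifest directory; the volume reconciler entries that follow mount host certificate and kubeconfig paths into them. A heavily trimmed sketch of what such a manifest looks like (field values are placeholders; the real files live under /etc/kubernetes/manifests):

    cat /etc/kubernetes/manifests/kube-apiserver.yaml
    apiVersion: v1
    kind: Pod
    metadata:
      name: kube-apiserver
      namespace: kube-system
    spec:
      containers:
      - name: kube-apiserver
        image: registry.k8s.io/kube-apiserver:v1.30.11
        volumeMounts:
        - name: k8s-certs
          mountPath: /etc/kubernetes/pki
          readOnly: true
      volumes:
      - name: k8s-certs
        hostPath:
          path: /etc/kubernetes/pki
          type: DirectoryOrCreate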
Mar 25 01:33:11.635505 kubelet[2517]: I0325 01:33:11.635322 2517 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0982b4988025caeb91f95b38306a6938-k8s-certs\") pod \"kube-apiserver-srv-y0b1r.gb1.brightbox.com\" (UID: \"0982b4988025caeb91f95b38306a6938\") " pod="kube-system/kube-apiserver-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.635505 kubelet[2517]: E0325 01:33:11.635362 2517 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.75.178:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-y0b1r.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.75.178:6443: connect: connection refused" interval="400ms" Mar 25 01:33:11.635505 kubelet[2517]: I0325 01:33:11.635378 2517 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7f4b9fe202ba7f7a3ab1ae3d4cd08969-ca-certs\") pod \"kube-controller-manager-srv-y0b1r.gb1.brightbox.com\" (UID: \"7f4b9fe202ba7f7a3ab1ae3d4cd08969\") " pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.635505 kubelet[2517]: I0325 01:33:11.635421 2517 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7f4b9fe202ba7f7a3ab1ae3d4cd08969-k8s-certs\") pod \"kube-controller-manager-srv-y0b1r.gb1.brightbox.com\" (UID: \"7f4b9fe202ba7f7a3ab1ae3d4cd08969\") " pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.635505 kubelet[2517]: I0325 01:33:11.635502 2517 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0982b4988025caeb91f95b38306a6938-ca-certs\") pod \"kube-apiserver-srv-y0b1r.gb1.brightbox.com\" (UID: \"0982b4988025caeb91f95b38306a6938\") " pod="kube-system/kube-apiserver-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.635842 kubelet[2517]: I0325 01:33:11.635530 2517 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0982b4988025caeb91f95b38306a6938-usr-share-ca-certificates\") pod \"kube-apiserver-srv-y0b1r.gb1.brightbox.com\" (UID: \"0982b4988025caeb91f95b38306a6938\") " pod="kube-system/kube-apiserver-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.635842 kubelet[2517]: I0325 01:33:11.635554 2517 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7f4b9fe202ba7f7a3ab1ae3d4cd08969-flexvolume-dir\") pod \"kube-controller-manager-srv-y0b1r.gb1.brightbox.com\" (UID: \"7f4b9fe202ba7f7a3ab1ae3d4cd08969\") " pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.635842 kubelet[2517]: I0325 01:33:11.635579 2517 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7f4b9fe202ba7f7a3ab1ae3d4cd08969-kubeconfig\") pod \"kube-controller-manager-srv-y0b1r.gb1.brightbox.com\" (UID: \"7f4b9fe202ba7f7a3ab1ae3d4cd08969\") " pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.635842 kubelet[2517]: I0325 01:33:11.635603 2517 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7f4b9fe202ba7f7a3ab1ae3d4cd08969-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-y0b1r.gb1.brightbox.com\" (UID: \"7f4b9fe202ba7f7a3ab1ae3d4cd08969\") " pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.635842 kubelet[2517]: I0325 01:33:11.635641 2517 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fb9eb9823e97d6c912ef3c7d0b97e2d8-kubeconfig\") pod \"kube-scheduler-srv-y0b1r.gb1.brightbox.com\" (UID: \"fb9eb9823e97d6c912ef3c7d0b97e2d8\") " pod="kube-system/kube-scheduler-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.729898 kubelet[2517]: I0325 01:33:11.729845 2517 kubelet_node_status.go:73] "Attempting to register node" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.734457 kubelet[2517]: E0325 01:33:11.732601 2517 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.243.75.178:6443/api/v1/nodes\": dial tcp 10.243.75.178:6443: connect: connection refused" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:11.908488 containerd[1528]: time="2025-03-25T01:33:11.908191983Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-y0b1r.gb1.brightbox.com,Uid:0982b4988025caeb91f95b38306a6938,Namespace:kube-system,Attempt:0,}" Mar 25 01:33:11.917152 containerd[1528]: time="2025-03-25T01:33:11.917103673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-y0b1r.gb1.brightbox.com,Uid:7f4b9fe202ba7f7a3ab1ae3d4cd08969,Namespace:kube-system,Attempt:0,}" Mar 25 01:33:11.920000 containerd[1528]: time="2025-03-25T01:33:11.919737552Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-y0b1r.gb1.brightbox.com,Uid:fb9eb9823e97d6c912ef3c7d0b97e2d8,Namespace:kube-system,Attempt:0,}" Mar 25 01:33:12.037087 kubelet[2517]: E0325 01:33:12.037002 2517 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.75.178:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-y0b1r.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.75.178:6443: connect: connection refused" interval="800ms" Mar 25 01:33:12.136228 kubelet[2517]: I0325 01:33:12.136169 2517 kubelet_node_status.go:73] "Attempting to register node" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:12.136963 kubelet[2517]: E0325 01:33:12.136828 2517 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.243.75.178:6443/api/v1/nodes\": dial tcp 10.243.75.178:6443: connect: connection refused" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:12.578619 sshd[2320]: PAM: Permission denied for root from 218.92.0.228 Mar 25 01:33:12.592141 kubelet[2517]: W0325 01:33:12.592025 2517 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.243.75.178:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-y0b1r.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:12.592141 kubelet[2517]: E0325 01:33:12.592108 2517 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.243.75.178:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-y0b1r.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:12.617288 kubelet[2517]: W0325 
01:33:12.617165 2517 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.243.75.178:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:12.617288 kubelet[2517]: E0325 01:33:12.617244 2517 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.243.75.178:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:12.832529 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2278335831.mount: Deactivated successfully. Mar 25 01:33:12.838762 kubelet[2517]: E0325 01:33:12.838684 2517 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.243.75.178:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-y0b1r.gb1.brightbox.com?timeout=10s\": dial tcp 10.243.75.178:6443: connect: connection refused" interval="1.6s" Mar 25 01:33:12.839638 containerd[1528]: time="2025-03-25T01:33:12.839530739Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:33:12.845324 containerd[1528]: time="2025-03-25T01:33:12.845178645Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Mar 25 01:33:12.845970 containerd[1528]: time="2025-03-25T01:33:12.845922761Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:33:12.846869 containerd[1528]: time="2025-03-25T01:33:12.846784268Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:33:12.848334 containerd[1528]: time="2025-03-25T01:33:12.848299544Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:33:12.849611 containerd[1528]: time="2025-03-25T01:33:12.849550147Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 25 01:33:12.851086 containerd[1528]: time="2025-03-25T01:33:12.850916436Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 25 01:33:12.852220 containerd[1528]: time="2025-03-25T01:33:12.852137996Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:33:12.854602 containerd[1528]: time="2025-03-25T01:33:12.853268911Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 924.575963ms" Mar 25 01:33:12.856110 containerd[1528]: time="2025-03-25T01:33:12.856068736Z" level=info msg="Pulled image 
\"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 936.978152ms" Mar 25 01:33:12.868829 containerd[1528]: time="2025-03-25T01:33:12.868407189Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 955.673782ms" Mar 25 01:33:12.942132 kubelet[2517]: I0325 01:33:12.942090 2517 kubelet_node_status.go:73] "Attempting to register node" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:12.943243 kubelet[2517]: E0325 01:33:12.943163 2517 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.243.75.178:6443/api/v1/nodes\": dial tcp 10.243.75.178:6443: connect: connection refused" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:12.961938 sshd-session[2552]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.228 user=root Mar 25 01:33:12.999736 containerd[1528]: time="2025-03-25T01:33:12.999138798Z" level=info msg="connecting to shim 75fa79b9d02fb1f96cfd8ec5bb4e02d54c6f57427bc7bdb70ac638e030d691bb" address="unix:///run/containerd/s/58b83888450983ef0b1262adf73b91018b5e727416607e59db7bf8c2ee291272" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:33:13.006878 containerd[1528]: time="2025-03-25T01:33:13.006283641Z" level=info msg="connecting to shim 47639894eb3390178f516d14b31f909606d84aea736cac7799922997d2e54de6" address="unix:///run/containerd/s/8dd452509a92c1bb474044ff6bcc752588f70db24b42cf31de22724b356c1ed5" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:33:13.018921 containerd[1528]: time="2025-03-25T01:33:13.018161854Z" level=info msg="connecting to shim c3d6e4168b2f321c26c4afdbcb5cb699e1ad3cd7237b8c656a29d70a902d6895" address="unix:///run/containerd/s/fdd4d41fc5dfeedc2f1b1f7412b1f8fa59f9076202e5e1f73cb8df250da879b9" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:33:13.026403 kubelet[2517]: W0325 01:33:13.026201 2517 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.243.75.178:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:13.026403 kubelet[2517]: E0325 01:33:13.026335 2517 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.243.75.178:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:13.036336 kubelet[2517]: W0325 01:33:13.036238 2517 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.243.75.178:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:13.036690 kubelet[2517]: E0325 01:33:13.036639 2517 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.243.75.178:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 
10.243.75.178:6443: connect: connection refused Mar 25 01:33:13.122765 systemd[1]: Started cri-containerd-47639894eb3390178f516d14b31f909606d84aea736cac7799922997d2e54de6.scope - libcontainer container 47639894eb3390178f516d14b31f909606d84aea736cac7799922997d2e54de6. Mar 25 01:33:13.127871 systemd[1]: Started cri-containerd-75fa79b9d02fb1f96cfd8ec5bb4e02d54c6f57427bc7bdb70ac638e030d691bb.scope - libcontainer container 75fa79b9d02fb1f96cfd8ec5bb4e02d54c6f57427bc7bdb70ac638e030d691bb. Mar 25 01:33:13.132277 systemd[1]: Started cri-containerd-c3d6e4168b2f321c26c4afdbcb5cb699e1ad3cd7237b8c656a29d70a902d6895.scope - libcontainer container c3d6e4168b2f321c26c4afdbcb5cb699e1ad3cd7237b8c656a29d70a902d6895. Mar 25 01:33:13.247698 containerd[1528]: time="2025-03-25T01:33:13.246424247Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-y0b1r.gb1.brightbox.com,Uid:0982b4988025caeb91f95b38306a6938,Namespace:kube-system,Attempt:0,} returns sandbox id \"75fa79b9d02fb1f96cfd8ec5bb4e02d54c6f57427bc7bdb70ac638e030d691bb\"" Mar 25 01:33:13.259328 containerd[1528]: time="2025-03-25T01:33:13.259268345Z" level=info msg="CreateContainer within sandbox \"75fa79b9d02fb1f96cfd8ec5bb4e02d54c6f57427bc7bdb70ac638e030d691bb\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 25 01:33:13.275182 containerd[1528]: time="2025-03-25T01:33:13.275124054Z" level=info msg="Container a9e0d5a9eefece240c37484cb836e964897a2357538acfb07cf28eeb176ef958: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:33:13.276802 containerd[1528]: time="2025-03-25T01:33:13.276696648Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-y0b1r.gb1.brightbox.com,Uid:7f4b9fe202ba7f7a3ab1ae3d4cd08969,Namespace:kube-system,Attempt:0,} returns sandbox id \"47639894eb3390178f516d14b31f909606d84aea736cac7799922997d2e54de6\"" Mar 25 01:33:13.281466 containerd[1528]: time="2025-03-25T01:33:13.281130163Z" level=info msg="CreateContainer within sandbox \"47639894eb3390178f516d14b31f909606d84aea736cac7799922997d2e54de6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 25 01:33:13.291912 containerd[1528]: time="2025-03-25T01:33:13.291774126Z" level=info msg="CreateContainer within sandbox \"75fa79b9d02fb1f96cfd8ec5bb4e02d54c6f57427bc7bdb70ac638e030d691bb\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"a9e0d5a9eefece240c37484cb836e964897a2357538acfb07cf28eeb176ef958\"" Mar 25 01:33:13.295979 containerd[1528]: time="2025-03-25T01:33:13.294987618Z" level=info msg="Container 74a1792ac2920cb40df552bb49d93b5568d78c286cfe0f58499f59371ef9837a: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:33:13.312378 containerd[1528]: time="2025-03-25T01:33:13.312288863Z" level=info msg="StartContainer for \"a9e0d5a9eefece240c37484cb836e964897a2357538acfb07cf28eeb176ef958\"" Mar 25 01:33:13.314993 containerd[1528]: time="2025-03-25T01:33:13.314930152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-y0b1r.gb1.brightbox.com,Uid:fb9eb9823e97d6c912ef3c7d0b97e2d8,Namespace:kube-system,Attempt:0,} returns sandbox id \"c3d6e4168b2f321c26c4afdbcb5cb699e1ad3cd7237b8c656a29d70a902d6895\"" Mar 25 01:33:13.315223 containerd[1528]: time="2025-03-25T01:33:13.315190822Z" level=info msg="connecting to shim a9e0d5a9eefece240c37484cb836e964897a2357538acfb07cf28eeb176ef958" address="unix:///run/containerd/s/58b83888450983ef0b1262adf73b91018b5e727416607e59db7bf8c2ee291272" protocol=ttrpc version=3 Mar 25 01:33:13.320760 
containerd[1528]: time="2025-03-25T01:33:13.320710012Z" level=info msg="CreateContainer within sandbox \"c3d6e4168b2f321c26c4afdbcb5cb699e1ad3cd7237b8c656a29d70a902d6895\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 25 01:33:13.342082 containerd[1528]: time="2025-03-25T01:33:13.341758669Z" level=info msg="CreateContainer within sandbox \"47639894eb3390178f516d14b31f909606d84aea736cac7799922997d2e54de6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"74a1792ac2920cb40df552bb49d93b5568d78c286cfe0f58499f59371ef9837a\"" Mar 25 01:33:13.343566 containerd[1528]: time="2025-03-25T01:33:13.343528836Z" level=info msg="StartContainer for \"74a1792ac2920cb40df552bb49d93b5568d78c286cfe0f58499f59371ef9837a\"" Mar 25 01:33:13.346300 containerd[1528]: time="2025-03-25T01:33:13.346267590Z" level=info msg="connecting to shim 74a1792ac2920cb40df552bb49d93b5568d78c286cfe0f58499f59371ef9837a" address="unix:///run/containerd/s/8dd452509a92c1bb474044ff6bcc752588f70db24b42cf31de22724b356c1ed5" protocol=ttrpc version=3 Mar 25 01:33:13.350096 systemd[1]: Started cri-containerd-a9e0d5a9eefece240c37484cb836e964897a2357538acfb07cf28eeb176ef958.scope - libcontainer container a9e0d5a9eefece240c37484cb836e964897a2357538acfb07cf28eeb176ef958. Mar 25 01:33:13.354657 containerd[1528]: time="2025-03-25T01:33:13.353885584Z" level=info msg="Container 2c8d66b34d4c5eafd49dc313edaf0ade35606315ff3d10c8d7df92549a9774e3: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:33:13.377284 containerd[1528]: time="2025-03-25T01:33:13.375528458Z" level=info msg="CreateContainer within sandbox \"c3d6e4168b2f321c26c4afdbcb5cb699e1ad3cd7237b8c656a29d70a902d6895\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"2c8d66b34d4c5eafd49dc313edaf0ade35606315ff3d10c8d7df92549a9774e3\"" Mar 25 01:33:13.382855 containerd[1528]: time="2025-03-25T01:33:13.382791804Z" level=info msg="StartContainer for \"2c8d66b34d4c5eafd49dc313edaf0ade35606315ff3d10c8d7df92549a9774e3\"" Mar 25 01:33:13.385703 containerd[1528]: time="2025-03-25T01:33:13.385658859Z" level=info msg="connecting to shim 2c8d66b34d4c5eafd49dc313edaf0ade35606315ff3d10c8d7df92549a9774e3" address="unix:///run/containerd/s/fdd4d41fc5dfeedc2f1b1f7412b1f8fa59f9076202e5e1f73cb8df250da879b9" protocol=ttrpc version=3 Mar 25 01:33:13.414849 systemd[1]: Started cri-containerd-74a1792ac2920cb40df552bb49d93b5568d78c286cfe0f58499f59371ef9837a.scope - libcontainer container 74a1792ac2920cb40df552bb49d93b5568d78c286cfe0f58499f59371ef9837a. Mar 25 01:33:13.419056 kubelet[2517]: E0325 01:33:13.418821 2517 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.243.75.178:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.243.75.178:6443: connect: connection refused Mar 25 01:33:13.433791 systemd[1]: Started cri-containerd-2c8d66b34d4c5eafd49dc313edaf0ade35606315ff3d10c8d7df92549a9774e3.scope - libcontainer container 2c8d66b34d4c5eafd49dc313edaf0ade35606315ff3d10c8d7df92549a9774e3. 
Mar 25 01:33:13.489636 containerd[1528]: time="2025-03-25T01:33:13.489525523Z" level=info msg="StartContainer for \"a9e0d5a9eefece240c37484cb836e964897a2357538acfb07cf28eeb176ef958\" returns successfully" Mar 25 01:33:13.545640 sshd[2297]: PAM: Permission denied for root from 218.92.0.221 Mar 25 01:33:13.554361 containerd[1528]: time="2025-03-25T01:33:13.554281071Z" level=info msg="StartContainer for \"74a1792ac2920cb40df552bb49d93b5568d78c286cfe0f58499f59371ef9837a\" returns successfully" Mar 25 01:33:13.597738 containerd[1528]: time="2025-03-25T01:33:13.597581538Z" level=info msg="StartContainer for \"2c8d66b34d4c5eafd49dc313edaf0ade35606315ff3d10c8d7df92549a9774e3\" returns successfully" Mar 25 01:33:13.744523 sshd[2297]: Received disconnect from 218.92.0.221 port 46308:11: [preauth] Mar 25 01:33:13.744523 sshd[2297]: Disconnected from authenticating user root 218.92.0.221 port 46308 [preauth] Mar 25 01:33:13.747231 systemd[1]: sshd@10-10.243.75.178:22-218.92.0.221:46308.service: Deactivated successfully. Mar 25 01:33:14.546536 kubelet[2517]: I0325 01:33:14.546182 2517 kubelet_node_status.go:73] "Attempting to register node" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:14.987816 sshd[2320]: PAM: Permission denied for root from 218.92.0.228 Mar 25 01:33:15.187495 sshd[2320]: Received disconnect from 218.92.0.228 port 33582:11: [preauth] Mar 25 01:33:15.187495 sshd[2320]: Disconnected from authenticating user root 218.92.0.228 port 33582 [preauth] Mar 25 01:33:15.191385 systemd[1]: sshd@11-10.243.75.178:22-218.92.0.228:33582.service: Deactivated successfully. Mar 25 01:33:15.431547 systemd[1]: Started sshd@12-10.243.75.178:22-218.92.0.228:52578.service - OpenSSH per-connection server daemon (218.92.0.228:52578). Mar 25 01:33:16.845839 kubelet[2517]: E0325 01:33:16.845716 2517 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-y0b1r.gb1.brightbox.com\" not found" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:16.892075 kubelet[2517]: I0325 01:33:16.889373 2517 kubelet_node_status.go:76] "Successfully registered node" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:17.136648 sshd-session[2799]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.228 user=root Mar 25 01:33:17.389637 kubelet[2517]: I0325 01:33:17.389346 2517 apiserver.go:52] "Watching apiserver" Mar 25 01:33:17.434651 kubelet[2517]: I0325 01:33:17.434491 2517 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:33:18.951486 systemd[1]: Started sshd@13-10.243.75.178:22-218.92.0.221:39028.service - OpenSSH per-connection server daemon (218.92.0.221:39028). Mar 25 01:33:18.974978 systemd[1]: Reload requested from client PID 2803 ('systemctl') (unit session-11.scope)... Mar 25 01:33:18.975025 systemd[1]: Reloading... Mar 25 01:33:19.042572 sshd[2797]: PAM: Permission denied for root from 218.92.0.228 Mar 25 01:33:19.119501 zram_generator::config[2853]: No configuration found. Mar 25 01:33:19.344474 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:33:19.498344 sshd-session[2906]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.228 user=root Mar 25 01:33:19.526085 systemd[1]: Reloading finished in 550 ms. 
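With the apiserver, controller-manager and scheduler containers started, the node registration that had been failing with "connection refused" finally succeeds ("Successfully registered node"), and the reload that finishes above precedes a kubelet restart under its real configuration; the entries below show it loading a client certificate from /var/lib/kubelet/pki/kubelet-client-current.pem. Two follow-up checks one might run at this stage (sketch; admin.conf is the conventional kubeadm admin kubeconfig path, not confirmed by this log):

    # confirm the node object exists and is heading toward Ready
    kubectl --kubeconfig /etc/kubernetes/admin.conf get node srv-y0b1r.gb1.brightbox.com
    # inspect the rotated kubelet client certificate's validity window
    openssl x509 -noout -dates -in /var/lib/kubelet/pki/kubelet-client-current.pem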
Mar 25 01:33:19.567398 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:33:19.582181 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:33:19.582713 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:33:19.582807 systemd[1]: kubelet.service: Consumed 1.451s CPU time, 113.9M memory peak. Mar 25 01:33:19.586389 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:33:19.816295 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:33:19.826171 (kubelet)[2918]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:33:19.948180 kubelet[2918]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:33:19.949816 kubelet[2918]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 25 01:33:19.949816 kubelet[2918]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:33:19.949816 kubelet[2918]: I0325 01:33:19.948904 2918 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:33:19.956763 kubelet[2918]: I0325 01:33:19.955832 2918 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 25 01:33:19.956763 kubelet[2918]: I0325 01:33:19.955863 2918 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:33:19.956763 kubelet[2918]: I0325 01:33:19.956151 2918 server.go:927] "Client rotation is on, will bootstrap in background" Mar 25 01:33:19.959458 kubelet[2918]: I0325 01:33:19.958406 2918 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 25 01:33:19.961189 kubelet[2918]: I0325 01:33:19.961005 2918 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:33:19.986465 kubelet[2918]: I0325 01:33:19.986397 2918 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:33:19.988174 kubelet[2918]: I0325 01:33:19.987281 2918 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:33:19.988174 kubelet[2918]: I0325 01:33:19.987339 2918 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-y0b1r.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 25 01:33:19.988174 kubelet[2918]: I0325 01:33:19.987645 2918 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:33:19.988174 kubelet[2918]: I0325 01:33:19.987664 2918 container_manager_linux.go:301] "Creating device plugin manager" Mar 25 01:33:19.990048 kubelet[2918]: I0325 01:33:19.989162 2918 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:33:19.992044 kubelet[2918]: I0325 01:33:19.991534 2918 kubelet.go:400] "Attempting to sync node with API server" Mar 25 01:33:19.992044 kubelet[2918]: I0325 01:33:19.991580 2918 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:33:19.992044 kubelet[2918]: I0325 01:33:19.991629 2918 kubelet.go:312] "Adding apiserver pod source" Mar 25 01:33:19.992044 kubelet[2918]: I0325 01:33:19.991657 2918 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:33:20.006476 kubelet[2918]: I0325 01:33:20.004862 2918 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:33:20.006476 kubelet[2918]: I0325 01:33:20.005166 2918 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:33:20.011303 kubelet[2918]: I0325 01:33:20.009628 2918 server.go:1264] "Started kubelet" Mar 25 01:33:20.022303 kubelet[2918]: I0325 01:33:20.021640 2918 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:33:20.040144 kubelet[2918]: I0325 01:33:20.039605 2918 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:33:20.044962 kubelet[2918]: I0325 01:33:20.044742 2918 ratelimit.go:55] "Setting 
rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:33:20.046004 kubelet[2918]: I0325 01:33:20.045411 2918 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:33:20.048927 kubelet[2918]: I0325 01:33:20.048895 2918 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 25 01:33:20.051310 kubelet[2918]: I0325 01:33:20.051258 2918 server.go:455] "Adding debug handlers to kubelet server" Mar 25 01:33:20.052789 kubelet[2918]: I0325 01:33:20.052704 2918 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:33:20.054948 kubelet[2918]: I0325 01:33:20.054646 2918 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:33:20.071291 kubelet[2918]: I0325 01:33:20.071077 2918 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:33:20.071291 kubelet[2918]: I0325 01:33:20.071226 2918 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:33:20.083305 kubelet[2918]: E0325 01:33:20.083238 2918 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:33:20.084357 kubelet[2918]: I0325 01:33:20.084158 2918 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:33:20.086495 kubelet[2918]: I0325 01:33:20.086426 2918 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:33:20.095964 kubelet[2918]: I0325 01:33:20.095665 2918 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:33:20.095964 kubelet[2918]: I0325 01:33:20.095743 2918 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 25 01:33:20.095964 kubelet[2918]: I0325 01:33:20.095793 2918 kubelet.go:2337] "Starting kubelet main sync loop" Mar 25 01:33:20.095964 kubelet[2918]: E0325 01:33:20.095876 2918 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:33:20.192100 kubelet[2918]: I0325 01:33:20.191361 2918 kubelet_node_status.go:73] "Attempting to register node" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.196029 kubelet[2918]: E0325 01:33:20.195999 2918 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 25 01:33:20.224891 kubelet[2918]: I0325 01:33:20.224789 2918 kubelet_node_status.go:112] "Node was previously registered" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.226162 kubelet[2918]: I0325 01:33:20.226076 2918 kubelet_node_status.go:76] "Successfully registered node" node="srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.287341 kubelet[2918]: I0325 01:33:20.287286 2918 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 25 01:33:20.288258 kubelet[2918]: I0325 01:33:20.287326 2918 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 25 01:33:20.288258 kubelet[2918]: I0325 01:33:20.287534 2918 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:33:20.289771 kubelet[2918]: I0325 01:33:20.289696 2918 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 25 01:33:20.289867 kubelet[2918]: I0325 01:33:20.289731 2918 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 25 
01:33:20.289959 kubelet[2918]: I0325 01:33:20.289869 2918 policy_none.go:49] "None policy: Start" Mar 25 01:33:20.295169 kubelet[2918]: I0325 01:33:20.294724 2918 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 25 01:33:20.295169 kubelet[2918]: I0325 01:33:20.294809 2918 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:33:20.296030 kubelet[2918]: I0325 01:33:20.295593 2918 state_mem.go:75] "Updated machine memory state" Mar 25 01:33:20.313371 kubelet[2918]: I0325 01:33:20.313228 2918 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:33:20.320163 kubelet[2918]: I0325 01:33:20.315301 2918 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:33:20.320163 kubelet[2918]: I0325 01:33:20.316985 2918 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:33:20.397039 kubelet[2918]: I0325 01:33:20.396667 2918 topology_manager.go:215] "Topology Admit Handler" podUID="7f4b9fe202ba7f7a3ab1ae3d4cd08969" podNamespace="kube-system" podName="kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.398211 kubelet[2918]: I0325 01:33:20.398166 2918 topology_manager.go:215] "Topology Admit Handler" podUID="fb9eb9823e97d6c912ef3c7d0b97e2d8" podNamespace="kube-system" podName="kube-scheduler-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.398836 kubelet[2918]: I0325 01:33:20.398811 2918 topology_manager.go:215] "Topology Admit Handler" podUID="0982b4988025caeb91f95b38306a6938" podNamespace="kube-system" podName="kube-apiserver-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.417513 kubelet[2918]: W0325 01:33:20.417453 2918 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 25 01:33:20.420163 kubelet[2918]: W0325 01:33:20.418922 2918 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 25 01:33:20.420504 kubelet[2918]: W0325 01:33:20.419078 2918 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 25 01:33:20.458204 kubelet[2918]: I0325 01:33:20.458107 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7f4b9fe202ba7f7a3ab1ae3d4cd08969-k8s-certs\") pod \"kube-controller-manager-srv-y0b1r.gb1.brightbox.com\" (UID: \"7f4b9fe202ba7f7a3ab1ae3d4cd08969\") " pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.458493 kubelet[2918]: I0325 01:33:20.458225 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7f4b9fe202ba7f7a3ab1ae3d4cd08969-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-y0b1r.gb1.brightbox.com\" (UID: \"7f4b9fe202ba7f7a3ab1ae3d4cd08969\") " pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.458493 kubelet[2918]: I0325 01:33:20.458327 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0982b4988025caeb91f95b38306a6938-ca-certs\") pod \"kube-apiserver-srv-y0b1r.gb1.brightbox.com\" (UID: 
\"0982b4988025caeb91f95b38306a6938\") " pod="kube-system/kube-apiserver-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.458493 kubelet[2918]: I0325 01:33:20.458371 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0982b4988025caeb91f95b38306a6938-k8s-certs\") pod \"kube-apiserver-srv-y0b1r.gb1.brightbox.com\" (UID: \"0982b4988025caeb91f95b38306a6938\") " pod="kube-system/kube-apiserver-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.458493 kubelet[2918]: I0325 01:33:20.458411 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0982b4988025caeb91f95b38306a6938-usr-share-ca-certificates\") pod \"kube-apiserver-srv-y0b1r.gb1.brightbox.com\" (UID: \"0982b4988025caeb91f95b38306a6938\") " pod="kube-system/kube-apiserver-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.458493 kubelet[2918]: I0325 01:33:20.458481 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7f4b9fe202ba7f7a3ab1ae3d4cd08969-ca-certs\") pod \"kube-controller-manager-srv-y0b1r.gb1.brightbox.com\" (UID: \"7f4b9fe202ba7f7a3ab1ae3d4cd08969\") " pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.458769 kubelet[2918]: I0325 01:33:20.458529 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7f4b9fe202ba7f7a3ab1ae3d4cd08969-flexvolume-dir\") pod \"kube-controller-manager-srv-y0b1r.gb1.brightbox.com\" (UID: \"7f4b9fe202ba7f7a3ab1ae3d4cd08969\") " pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.458769 kubelet[2918]: I0325 01:33:20.458558 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7f4b9fe202ba7f7a3ab1ae3d4cd08969-kubeconfig\") pod \"kube-controller-manager-srv-y0b1r.gb1.brightbox.com\" (UID: \"7f4b9fe202ba7f7a3ab1ae3d4cd08969\") " pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.458769 kubelet[2918]: I0325 01:33:20.458599 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fb9eb9823e97d6c912ef3c7d0b97e2d8-kubeconfig\") pod \"kube-scheduler-srv-y0b1r.gb1.brightbox.com\" (UID: \"fb9eb9823e97d6c912ef3c7d0b97e2d8\") " pod="kube-system/kube-scheduler-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:20.997811 kubelet[2918]: I0325 01:33:20.997322 2918 apiserver.go:52] "Watching apiserver" Mar 25 01:33:21.054829 kubelet[2918]: I0325 01:33:21.054676 2918 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:33:21.085334 kubelet[2918]: I0325 01:33:21.083248 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-y0b1r.gb1.brightbox.com" podStartSLOduration=1.083188114 podStartE2EDuration="1.083188114s" podCreationTimestamp="2025-03-25 01:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:33:21.048383477 +0000 UTC m=+1.200993651" watchObservedRunningTime="2025-03-25 01:33:21.083188114 +0000 UTC m=+1.235798276" Mar 25 
01:33:21.117177 kubelet[2918]: I0325 01:33:21.110251 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-y0b1r.gb1.brightbox.com" podStartSLOduration=1.110217151 podStartE2EDuration="1.110217151s" podCreationTimestamp="2025-03-25 01:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:33:21.085127389 +0000 UTC m=+1.237737575" watchObservedRunningTime="2025-03-25 01:33:21.110217151 +0000 UTC m=+1.262827324" Mar 25 01:33:21.229491 kubelet[2918]: W0325 01:33:21.228118 2918 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 25 01:33:21.229491 kubelet[2918]: W0325 01:33:21.228219 2918 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 25 01:33:21.229491 kubelet[2918]: E0325 01:33:21.228298 2918 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-srv-y0b1r.gb1.brightbox.com\" already exists" pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:21.233227 kubelet[2918]: E0325 01:33:21.231976 2918 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-srv-y0b1r.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-y0b1r.gb1.brightbox.com" Mar 25 01:33:21.270643 kubelet[2918]: I0325 01:33:21.270402 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-y0b1r.gb1.brightbox.com" podStartSLOduration=1.270371801 podStartE2EDuration="1.270371801s" podCreationTimestamp="2025-03-25 01:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:33:21.121034766 +0000 UTC m=+1.273644914" watchObservedRunningTime="2025-03-25 01:33:21.270371801 +0000 UTC m=+1.422981980" Mar 25 01:33:21.815764 sshd[2797]: PAM: Permission denied for root from 218.92.0.228 Mar 25 01:33:22.286214 sshd-session[2957]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.228 user=root Mar 25 01:33:24.682915 sshd[2797]: PAM: Permission denied for root from 218.92.0.228 Mar 25 01:33:24.919569 sshd[2797]: Received disconnect from 218.92.0.228 port 52578:11: [preauth] Mar 25 01:33:24.919569 sshd[2797]: Disconnected from authenticating user root 218.92.0.228 port 52578 [preauth] Mar 25 01:33:24.923856 systemd[1]: sshd@12-10.243.75.178:22-218.92.0.228:52578.service: Deactivated successfully. Mar 25 01:33:26.018176 sudo[1816]: pam_unix(sudo:session): session closed for user root Mar 25 01:33:26.163556 sshd[1815]: Connection closed by 139.178.68.195 port 38476 Mar 25 01:33:26.165124 sshd-session[1813]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:26.208893 systemd[1]: sshd@8-10.243.75.178:22-139.178.68.195:38476.service: Deactivated successfully. Mar 25 01:33:26.211358 systemd[1]: session-11.scope: Deactivated successfully. Mar 25 01:33:26.211708 systemd[1]: session-11.scope: Consumed 5.928s CPU time, 183.9M memory peak. Mar 25 01:33:26.216121 systemd-logind[1504]: Session 11 logged out. Waiting for processes to exit. Mar 25 01:33:26.222938 systemd-logind[1504]: Removed session 11. 
Mar 25 01:33:29.449263 systemd[1]: Started sshd@14-10.243.75.178:22-218.92.0.221:45474.service - OpenSSH per-connection server daemon (218.92.0.221:45474). Mar 25 01:33:31.144960 sshd-session[3000]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.221 user=root Mar 25 01:33:33.305666 sshd[2998]: PAM: Permission denied for root from 218.92.0.221 Mar 25 01:33:33.765004 sshd-session[3001]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.221 user=root Mar 25 01:33:34.440781 kubelet[2918]: I0325 01:33:34.440722 2918 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 25 01:33:34.442518 containerd[1528]: time="2025-03-25T01:33:34.441911209Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 25 01:33:34.442994 kubelet[2918]: I0325 01:33:34.442223 2918 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 25 01:33:34.507270 kubelet[2918]: I0325 01:33:34.504911 2918 topology_manager.go:215] "Topology Admit Handler" podUID="ec0a0268-10a5-4cf3-8d97-ad67b6086fab" podNamespace="kube-system" podName="kube-proxy-9dgv5" Mar 25 01:33:34.521350 systemd[1]: Created slice kubepods-besteffort-podec0a0268_10a5_4cf3_8d97_ad67b6086fab.slice - libcontainer container kubepods-besteffort-podec0a0268_10a5_4cf3_8d97_ad67b6086fab.slice. Mar 25 01:33:34.532552 kubelet[2918]: W0325 01:33:34.532488 2918 reflector.go:547] object-"kube-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:srv-y0b1r.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'srv-y0b1r.gb1.brightbox.com' and this object Mar 25 01:33:34.532844 kubelet[2918]: E0325 01:33:34.532559 2918 reflector.go:150] object-"kube-system"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:srv-y0b1r.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'srv-y0b1r.gb1.brightbox.com' and this object Mar 25 01:33:34.532844 kubelet[2918]: W0325 01:33:34.532661 2918 reflector.go:547] object-"kube-system"/"kube-proxy": failed to list *v1.ConfigMap: configmaps "kube-proxy" is forbidden: User "system:node:srv-y0b1r.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'srv-y0b1r.gb1.brightbox.com' and this object Mar 25 01:33:34.532844 kubelet[2918]: E0325 01:33:34.532704 2918 reflector.go:150] object-"kube-system"/"kube-proxy": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-proxy" is forbidden: User "system:node:srv-y0b1r.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'srv-y0b1r.gb1.brightbox.com' and this object Mar 25 01:33:34.565623 kubelet[2918]: I0325 01:33:34.565568 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec0a0268-10a5-4cf3-8d97-ad67b6086fab-lib-modules\") pod \"kube-proxy-9dgv5\" (UID: \"ec0a0268-10a5-4cf3-8d97-ad67b6086fab\") " pod="kube-system/kube-proxy-9dgv5" Mar 25 01:33:34.565982 
kubelet[2918]: I0325 01:33:34.565843 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qznmd\" (UniqueName: \"kubernetes.io/projected/ec0a0268-10a5-4cf3-8d97-ad67b6086fab-kube-api-access-qznmd\") pod \"kube-proxy-9dgv5\" (UID: \"ec0a0268-10a5-4cf3-8d97-ad67b6086fab\") " pod="kube-system/kube-proxy-9dgv5" Mar 25 01:33:34.565982 kubelet[2918]: I0325 01:33:34.565886 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ec0a0268-10a5-4cf3-8d97-ad67b6086fab-xtables-lock\") pod \"kube-proxy-9dgv5\" (UID: \"ec0a0268-10a5-4cf3-8d97-ad67b6086fab\") " pod="kube-system/kube-proxy-9dgv5" Mar 25 01:33:34.565982 kubelet[2918]: I0325 01:33:34.565915 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ec0a0268-10a5-4cf3-8d97-ad67b6086fab-kube-proxy\") pod \"kube-proxy-9dgv5\" (UID: \"ec0a0268-10a5-4cf3-8d97-ad67b6086fab\") " pod="kube-system/kube-proxy-9dgv5" Mar 25 01:33:34.719620 kubelet[2918]: I0325 01:33:34.719443 2918 topology_manager.go:215] "Topology Admit Handler" podUID="ef5f22b1-7482-4c24-b84e-dc736e3dbdac" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-dnn4z" Mar 25 01:33:34.730690 systemd[1]: Created slice kubepods-besteffort-podef5f22b1_7482_4c24_b84e_dc736e3dbdac.slice - libcontainer container kubepods-besteffort-podef5f22b1_7482_4c24_b84e_dc736e3dbdac.slice. Mar 25 01:33:34.768122 kubelet[2918]: I0325 01:33:34.768056 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg7b5\" (UniqueName: \"kubernetes.io/projected/ef5f22b1-7482-4c24-b84e-dc736e3dbdac-kube-api-access-dg7b5\") pod \"tigera-operator-6479d6dc54-dnn4z\" (UID: \"ef5f22b1-7482-4c24-b84e-dc736e3dbdac\") " pod="tigera-operator/tigera-operator-6479d6dc54-dnn4z" Mar 25 01:33:34.768278 kubelet[2918]: I0325 01:33:34.768145 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ef5f22b1-7482-4c24-b84e-dc736e3dbdac-var-lib-calico\") pod \"tigera-operator-6479d6dc54-dnn4z\" (UID: \"ef5f22b1-7482-4c24-b84e-dc736e3dbdac\") " pod="tigera-operator/tigera-operator-6479d6dc54-dnn4z" Mar 25 01:33:35.037255 containerd[1528]: time="2025-03-25T01:33:35.037046735Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-dnn4z,Uid:ef5f22b1-7482-4c24-b84e-dc736e3dbdac,Namespace:tigera-operator,Attempt:0,}" Mar 25 01:33:35.064393 containerd[1528]: time="2025-03-25T01:33:35.064317897Z" level=info msg="connecting to shim 47a020ec71a7b28e6aa79b8693b446a179f4c0bb1ff11fce385848377b0ad87a" address="unix:///run/containerd/s/5e929d54e90ad41ed27e2ada3056894158479870c0eabe03dee4f48035706779" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:33:35.109642 systemd[1]: Started cri-containerd-47a020ec71a7b28e6aa79b8693b446a179f4c0bb1ff11fce385848377b0ad87a.scope - libcontainer container 47a020ec71a7b28e6aa79b8693b446a179f4c0bb1ff11fce385848377b0ad87a. 
Mar 25 01:33:35.185276 containerd[1528]: time="2025-03-25T01:33:35.185115231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-dnn4z,Uid:ef5f22b1-7482-4c24-b84e-dc736e3dbdac,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"47a020ec71a7b28e6aa79b8693b446a179f4c0bb1ff11fce385848377b0ad87a\"" Mar 25 01:33:35.187594 containerd[1528]: time="2025-03-25T01:33:35.187554393Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 25 01:33:35.669090 kubelet[2918]: E0325 01:33:35.668985 2918 configmap.go:199] Couldn't get configMap kube-system/kube-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:33:35.669819 kubelet[2918]: E0325 01:33:35.669188 2918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec0a0268-10a5-4cf3-8d97-ad67b6086fab-kube-proxy podName:ec0a0268-10a5-4cf3-8d97-ad67b6086fab nodeName:}" failed. No retries permitted until 2025-03-25 01:33:36.169138628 +0000 UTC m=+16.321748776 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-proxy" (UniqueName: "kubernetes.io/configmap/ec0a0268-10a5-4cf3-8d97-ad67b6086fab-kube-proxy") pod "kube-proxy-9dgv5" (UID: "ec0a0268-10a5-4cf3-8d97-ad67b6086fab") : failed to sync configmap cache: timed out waiting for the condition Mar 25 01:33:35.727983 kubelet[2918]: E0325 01:33:35.727872 2918 projected.go:294] Couldn't get configMap kube-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:33:35.727983 kubelet[2918]: E0325 01:33:35.727947 2918 projected.go:200] Error preparing data for projected volume kube-api-access-qznmd for pod kube-system/kube-proxy-9dgv5: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:33:35.728239 kubelet[2918]: E0325 01:33:35.728053 2918 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec0a0268-10a5-4cf3-8d97-ad67b6086fab-kube-api-access-qznmd podName:ec0a0268-10a5-4cf3-8d97-ad67b6086fab nodeName:}" failed. No retries permitted until 2025-03-25 01:33:36.22803021 +0000 UTC m=+16.380640360 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qznmd" (UniqueName: "kubernetes.io/projected/ec0a0268-10a5-4cf3-8d97-ad67b6086fab-kube-api-access-qznmd") pod "kube-proxy-9dgv5" (UID: "ec0a0268-10a5-4cf3-8d97-ad67b6086fab") : failed to sync configmap cache: timed out waiting for the condition Mar 25 01:33:36.006130 sshd[2998]: PAM: Permission denied for root from 218.92.0.221 Mar 25 01:33:36.332360 containerd[1528]: time="2025-03-25T01:33:36.332034875Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9dgv5,Uid:ec0a0268-10a5-4cf3-8d97-ad67b6086fab,Namespace:kube-system,Attempt:0,}" Mar 25 01:33:36.355899 containerd[1528]: time="2025-03-25T01:33:36.355843164Z" level=info msg="connecting to shim 31db5c103750f0eb418e8a039913eb4d1f28e65b280d27301ef5f2d2f316be20" address="unix:///run/containerd/s/a7ebc803182e630facc57a914f3c287f0302ffcf95d0ac49690e8f5d769bfdef" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:33:36.391717 systemd[1]: Started cri-containerd-31db5c103750f0eb418e8a039913eb4d1f28e65b280d27301ef5f2d2f316be20.scope - libcontainer container 31db5c103750f0eb418e8a039913eb4d1f28e65b280d27301ef5f2d2f316be20. 
Mar 25 01:33:36.431318 containerd[1528]: time="2025-03-25T01:33:36.431273322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9dgv5,Uid:ec0a0268-10a5-4cf3-8d97-ad67b6086fab,Namespace:kube-system,Attempt:0,} returns sandbox id \"31db5c103750f0eb418e8a039913eb4d1f28e65b280d27301ef5f2d2f316be20\"" Mar 25 01:33:36.436907 containerd[1528]: time="2025-03-25T01:33:36.436873476Z" level=info msg="CreateContainer within sandbox \"31db5c103750f0eb418e8a039913eb4d1f28e65b280d27301ef5f2d2f316be20\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 25 01:33:36.465572 containerd[1528]: time="2025-03-25T01:33:36.465500876Z" level=info msg="Container d7bddb99e7cfb1f985a171f66aae7a0f48da54bfa7927eed1db1aa9b35a1dce9: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:33:36.467357 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2369223349.mount: Deactivated successfully. Mar 25 01:33:36.476645 containerd[1528]: time="2025-03-25T01:33:36.476535084Z" level=info msg="CreateContainer within sandbox \"31db5c103750f0eb418e8a039913eb4d1f28e65b280d27301ef5f2d2f316be20\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d7bddb99e7cfb1f985a171f66aae7a0f48da54bfa7927eed1db1aa9b35a1dce9\"" Mar 25 01:33:36.477897 containerd[1528]: time="2025-03-25T01:33:36.477866564Z" level=info msg="StartContainer for \"d7bddb99e7cfb1f985a171f66aae7a0f48da54bfa7927eed1db1aa9b35a1dce9\"" Mar 25 01:33:36.484760 containerd[1528]: time="2025-03-25T01:33:36.484236607Z" level=info msg="connecting to shim d7bddb99e7cfb1f985a171f66aae7a0f48da54bfa7927eed1db1aa9b35a1dce9" address="unix:///run/containerd/s/a7ebc803182e630facc57a914f3c287f0302ffcf95d0ac49690e8f5d769bfdef" protocol=ttrpc version=3 Mar 25 01:33:36.489628 sshd-session[3047]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.221 user=root Mar 25 01:33:36.513652 systemd[1]: Started cri-containerd-d7bddb99e7cfb1f985a171f66aae7a0f48da54bfa7927eed1db1aa9b35a1dce9.scope - libcontainer container d7bddb99e7cfb1f985a171f66aae7a0f48da54bfa7927eed1db1aa9b35a1dce9. Mar 25 01:33:36.574475 containerd[1528]: time="2025-03-25T01:33:36.574358198Z" level=info msg="StartContainer for \"d7bddb99e7cfb1f985a171f66aae7a0f48da54bfa7927eed1db1aa9b35a1dce9\" returns successfully" Mar 25 01:33:37.247253 kubelet[2918]: I0325 01:33:37.247087 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-9dgv5" podStartSLOduration=3.247036633 podStartE2EDuration="3.247036633s" podCreationTimestamp="2025-03-25 01:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:33:37.245808773 +0000 UTC m=+17.398418950" watchObservedRunningTime="2025-03-25 01:33:37.247036633 +0000 UTC m=+17.399646788" Mar 25 01:33:37.448289 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1906248977.mount: Deactivated successfully. 
Mar 25 01:33:38.428903 containerd[1528]: time="2025-03-25T01:33:38.428814557Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:38.430803 containerd[1528]: time="2025-03-25T01:33:38.430714277Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008" Mar 25 01:33:38.431470 containerd[1528]: time="2025-03-25T01:33:38.431228159Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:38.433746 containerd[1528]: time="2025-03-25T01:33:38.433690253Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:38.434906 containerd[1528]: time="2025-03-25T01:33:38.434709819Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 3.247100442s" Mar 25 01:33:38.434906 containerd[1528]: time="2025-03-25T01:33:38.434766139Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\"" Mar 25 01:33:38.442482 containerd[1528]: time="2025-03-25T01:33:38.442280375Z" level=info msg="CreateContainer within sandbox \"47a020ec71a7b28e6aa79b8693b446a179f4c0bb1ff11fce385848377b0ad87a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 25 01:33:38.453755 containerd[1528]: time="2025-03-25T01:33:38.453294752Z" level=info msg="Container d3475d04ded62acff371b4244c67a908ec0370bb1a4cf39fc71f95d85afe3c88: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:33:38.463624 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount156485019.mount: Deactivated successfully. Mar 25 01:33:38.474387 sshd[2998]: PAM: Permission denied for root from 218.92.0.221 Mar 25 01:33:38.475374 containerd[1528]: time="2025-03-25T01:33:38.475303618Z" level=info msg="CreateContainer within sandbox \"47a020ec71a7b28e6aa79b8693b446a179f4c0bb1ff11fce385848377b0ad87a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d3475d04ded62acff371b4244c67a908ec0370bb1a4cf39fc71f95d85afe3c88\"" Mar 25 01:33:38.476119 containerd[1528]: time="2025-03-25T01:33:38.475909360Z" level=info msg="StartContainer for \"d3475d04ded62acff371b4244c67a908ec0370bb1a4cf39fc71f95d85afe3c88\"" Mar 25 01:33:38.477845 containerd[1528]: time="2025-03-25T01:33:38.477799721Z" level=info msg="connecting to shim d3475d04ded62acff371b4244c67a908ec0370bb1a4cf39fc71f95d85afe3c88" address="unix:///run/containerd/s/5e929d54e90ad41ed27e2ada3056894158479870c0eabe03dee4f48035706779" protocol=ttrpc version=3 Mar 25 01:33:38.518762 systemd[1]: Started cri-containerd-d3475d04ded62acff371b4244c67a908ec0370bb1a4cf39fc71f95d85afe3c88.scope - libcontainer container d3475d04ded62acff371b4244c67a908ec0370bb1a4cf39fc71f95d85afe3c88. 
Mar 25 01:33:38.570946 containerd[1528]: time="2025-03-25T01:33:38.570803925Z" level=info msg="StartContainer for \"d3475d04ded62acff371b4244c67a908ec0370bb1a4cf39fc71f95d85afe3c88\" returns successfully" Mar 25 01:33:38.709981 sshd[2998]: Received disconnect from 218.92.0.221 port 45474:11: [preauth] Mar 25 01:33:38.709981 sshd[2998]: Disconnected from authenticating user root 218.92.0.221 port 45474 [preauth] Mar 25 01:33:38.712728 systemd[1]: sshd@14-10.243.75.178:22-218.92.0.221:45474.service: Deactivated successfully. Mar 25 01:33:38.944660 systemd[1]: Started sshd@15-10.243.75.178:22-218.92.0.221:38464.service - OpenSSH per-connection server daemon (218.92.0.221:38464). Mar 25 01:33:39.270061 kubelet[2918]: I0325 01:33:39.268615 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-dnn4z" podStartSLOduration=2.019272631 podStartE2EDuration="5.268527097s" podCreationTimestamp="2025-03-25 01:33:34 +0000 UTC" firstStartedPulling="2025-03-25 01:33:35.186926118 +0000 UTC m=+15.339536269" lastFinishedPulling="2025-03-25 01:33:38.436180573 +0000 UTC m=+18.588790735" observedRunningTime="2025-03-25 01:33:39.266172039 +0000 UTC m=+19.418782218" watchObservedRunningTime="2025-03-25 01:33:39.268527097 +0000 UTC m=+19.421137283" Mar 25 01:33:41.678541 sshd-session[3294]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.221 user=root Mar 25 01:33:41.956428 kubelet[2918]: I0325 01:33:41.954939 2918 topology_manager.go:215] "Topology Admit Handler" podUID="fd871a25-7d2c-4fb4-8049-0fd8f241ab9c" podNamespace="calico-system" podName="calico-typha-687dd987cb-wwwhk" Mar 25 01:33:41.972456 systemd[1]: Created slice kubepods-besteffort-podfd871a25_7d2c_4fb4_8049_0fd8f241ab9c.slice - libcontainer container kubepods-besteffort-podfd871a25_7d2c_4fb4_8049_0fd8f241ab9c.slice. 
Mar 25 01:33:42.016713 kubelet[2918]: I0325 01:33:42.016387 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-typha-certs\") pod \"calico-typha-687dd987cb-wwwhk\" (UID: \"fd871a25-7d2c-4fb4-8049-0fd8f241ab9c\") " pod="calico-system/calico-typha-687dd987cb-wwwhk" Mar 25 01:33:42.016713 kubelet[2918]: I0325 01:33:42.016536 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4x79\" (UniqueName: \"kubernetes.io/projected/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-kube-api-access-p4x79\") pod \"calico-typha-687dd987cb-wwwhk\" (UID: \"fd871a25-7d2c-4fb4-8049-0fd8f241ab9c\") " pod="calico-system/calico-typha-687dd987cb-wwwhk" Mar 25 01:33:42.016713 kubelet[2918]: I0325 01:33:42.016575 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-tigera-ca-bundle\") pod \"calico-typha-687dd987cb-wwwhk\" (UID: \"fd871a25-7d2c-4fb4-8049-0fd8f241ab9c\") " pod="calico-system/calico-typha-687dd987cb-wwwhk" Mar 25 01:33:42.095478 kubelet[2918]: I0325 01:33:42.095347 2918 topology_manager.go:215] "Topology Admit Handler" podUID="d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" podNamespace="calico-system" podName="calico-node-xrwrw" Mar 25 01:33:42.113084 systemd[1]: Created slice kubepods-besteffort-podd3b4fc6b_6a13_424a_a46e_4552d4b72a7a.slice - libcontainer container kubepods-besteffort-podd3b4fc6b_6a13_424a_a46e_4552d4b72a7a.slice. Mar 25 01:33:42.117495 kubelet[2918]: I0325 01:33:42.117123 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-lib-modules\") pod \"calico-node-xrwrw\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.117495 kubelet[2918]: I0325 01:33:42.117170 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-tigera-ca-bundle\") pod \"calico-node-xrwrw\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.117495 kubelet[2918]: I0325 01:33:42.117202 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-net-dir\") pod \"calico-node-xrwrw\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.117495 kubelet[2918]: I0325 01:33:42.117252 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-xtables-lock\") pod \"calico-node-xrwrw\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.117495 kubelet[2918]: I0325 01:33:42.117279 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-flexvol-driver-host\") pod \"calico-node-xrwrw\" (UID: 
\"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.117839 kubelet[2918]: I0325 01:33:42.117315 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-node-certs\") pod \"calico-node-xrwrw\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.117839 kubelet[2918]: I0325 01:33:42.117349 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-var-run-calico\") pod \"calico-node-xrwrw\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.117839 kubelet[2918]: I0325 01:33:42.117386 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-log-dir\") pod \"calico-node-xrwrw\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.117839 kubelet[2918]: I0325 01:33:42.117419 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-policysync\") pod \"calico-node-xrwrw\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.117839 kubelet[2918]: I0325 01:33:42.117479 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-bin-dir\") pod \"calico-node-xrwrw\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.118123 kubelet[2918]: I0325 01:33:42.117526 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-var-lib-calico\") pod \"calico-node-xrwrw\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.118123 kubelet[2918]: I0325 01:33:42.117587 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmxbk\" (UniqueName: \"kubernetes.io/projected/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-kube-api-access-mmxbk\") pod \"calico-node-xrwrw\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " pod="calico-system/calico-node-xrwrw" Mar 25 01:33:42.212561 kubelet[2918]: I0325 01:33:42.211318 2918 topology_manager.go:215] "Topology Admit Handler" podUID="8cfce3bd-710f-418f-a9b3-28030dc77aba" podNamespace="calico-system" podName="csi-node-driver-zwppf" Mar 25 01:33:42.213350 kubelet[2918]: E0325 01:33:42.213137 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zwppf" podUID="8cfce3bd-710f-418f-a9b3-28030dc77aba" Mar 25 01:33:42.223788 kubelet[2918]: E0325 01:33:42.223595 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON 
input Mar 25 01:33:42.223788 kubelet[2918]: W0325 01:33:42.223634 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.223788 kubelet[2918]: E0325 01:33:42.223698 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.229864 kubelet[2918]: E0325 01:33:42.229699 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.229864 kubelet[2918]: W0325 01:33:42.229776 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.229864 kubelet[2918]: E0325 01:33:42.229803 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.259569 kubelet[2918]: E0325 01:33:42.259532 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.259569 kubelet[2918]: W0325 01:33:42.259560 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.259767 kubelet[2918]: E0325 01:33:42.259584 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.285186 containerd[1528]: time="2025-03-25T01:33:42.284664450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-687dd987cb-wwwhk,Uid:fd871a25-7d2c-4fb4-8049-0fd8f241ab9c,Namespace:calico-system,Attempt:0,}" Mar 25 01:33:42.314592 kubelet[2918]: E0325 01:33:42.314548 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.314592 kubelet[2918]: W0325 01:33:42.314584 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.315025 kubelet[2918]: E0325 01:33:42.314611 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.315161 kubelet[2918]: E0325 01:33:42.315127 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.315161 kubelet[2918]: W0325 01:33:42.315141 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.315161 kubelet[2918]: E0325 01:33:42.315155 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.317429 kubelet[2918]: E0325 01:33:42.317407 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.317429 kubelet[2918]: W0325 01:33:42.317426 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.317662 kubelet[2918]: E0325 01:33:42.317474 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.318242 kubelet[2918]: E0325 01:33:42.318221 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.318242 kubelet[2918]: W0325 01:33:42.318239 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.318379 kubelet[2918]: E0325 01:33:42.318264 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.320692 kubelet[2918]: E0325 01:33:42.320669 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.320692 kubelet[2918]: W0325 01:33:42.320690 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.320835 kubelet[2918]: E0325 01:33:42.320810 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.321388 kubelet[2918]: E0325 01:33:42.321368 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.321388 kubelet[2918]: W0325 01:33:42.321387 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.321615 kubelet[2918]: E0325 01:33:42.321402 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.324555 kubelet[2918]: E0325 01:33:42.324524 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.324555 kubelet[2918]: W0325 01:33:42.324545 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.325556 kubelet[2918]: E0325 01:33:42.324563 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.325556 kubelet[2918]: E0325 01:33:42.324838 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.325556 kubelet[2918]: W0325 01:33:42.324852 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.325556 kubelet[2918]: E0325 01:33:42.324866 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.325556 kubelet[2918]: E0325 01:33:42.325416 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.325556 kubelet[2918]: W0325 01:33:42.325458 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.325556 kubelet[2918]: E0325 01:33:42.325476 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.326402 kubelet[2918]: E0325 01:33:42.326381 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.326402 kubelet[2918]: W0325 01:33:42.326399 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.326539 kubelet[2918]: E0325 01:33:42.326414 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.326705 kubelet[2918]: E0325 01:33:42.326684 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.326705 kubelet[2918]: W0325 01:33:42.326705 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.326837 kubelet[2918]: E0325 01:33:42.326721 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.329280 kubelet[2918]: E0325 01:33:42.329247 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.329280 kubelet[2918]: W0325 01:33:42.329275 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.329419 kubelet[2918]: E0325 01:33:42.329296 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.329597 kubelet[2918]: E0325 01:33:42.329576 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.329597 kubelet[2918]: W0325 01:33:42.329594 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.329735 kubelet[2918]: E0325 01:33:42.329610 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.329852 kubelet[2918]: E0325 01:33:42.329831 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.329852 kubelet[2918]: W0325 01:33:42.329849 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.329949 kubelet[2918]: E0325 01:33:42.329863 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.330117 kubelet[2918]: E0325 01:33:42.330091 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.330117 kubelet[2918]: W0325 01:33:42.330108 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.330579 kubelet[2918]: E0325 01:33:42.330123 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.330805 kubelet[2918]: E0325 01:33:42.330784 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.330805 kubelet[2918]: W0325 01:33:42.330802 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.330908 kubelet[2918]: E0325 01:33:42.330818 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.331577 kubelet[2918]: E0325 01:33:42.331555 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.331577 kubelet[2918]: W0325 01:33:42.331573 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.331959 kubelet[2918]: E0325 01:33:42.331589 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.333011 kubelet[2918]: E0325 01:33:42.332041 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.333011 kubelet[2918]: W0325 01:33:42.332054 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.333011 kubelet[2918]: E0325 01:33:42.332069 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.335905 kubelet[2918]: E0325 01:33:42.334528 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.335905 kubelet[2918]: W0325 01:33:42.334550 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.335905 kubelet[2918]: E0325 01:33:42.334568 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.335905 kubelet[2918]: E0325 01:33:42.334794 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.335905 kubelet[2918]: W0325 01:33:42.334808 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.335905 kubelet[2918]: E0325 01:33:42.334822 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.335905 kubelet[2918]: E0325 01:33:42.335343 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.335905 kubelet[2918]: W0325 01:33:42.335357 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.335905 kubelet[2918]: E0325 01:33:42.335371 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.336394 kubelet[2918]: I0325 01:33:42.335405 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/8cfce3bd-710f-418f-a9b3-28030dc77aba-varrun\") pod \"csi-node-driver-zwppf\" (UID: \"8cfce3bd-710f-418f-a9b3-28030dc77aba\") " pod="calico-system/csi-node-driver-zwppf" Mar 25 01:33:42.336394 kubelet[2918]: E0325 01:33:42.336110 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.336394 kubelet[2918]: W0325 01:33:42.336125 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.336394 kubelet[2918]: E0325 01:33:42.336159 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.336394 kubelet[2918]: I0325 01:33:42.336187 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cfce3bd-710f-418f-a9b3-28030dc77aba-registration-dir\") pod \"csi-node-driver-zwppf\" (UID: \"8cfce3bd-710f-418f-a9b3-28030dc77aba\") " pod="calico-system/csi-node-driver-zwppf" Mar 25 01:33:42.337088 containerd[1528]: time="2025-03-25T01:33:42.336742691Z" level=info msg="connecting to shim 3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b" address="unix:///run/containerd/s/3569dcd4d50efa8639133b4af0b971b773ea8900e1a183e0e7b3489eac525405" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:33:42.338627 kubelet[2918]: E0325 01:33:42.338511 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.338627 kubelet[2918]: W0325 01:33:42.338530 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.338627 kubelet[2918]: E0325 01:33:42.338565 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.338980 kubelet[2918]: E0325 01:33:42.338822 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.338980 kubelet[2918]: W0325 01:33:42.338835 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.338980 kubelet[2918]: E0325 01:33:42.338927 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.339527 kubelet[2918]: E0325 01:33:42.339144 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.339527 kubelet[2918]: W0325 01:33:42.339163 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.339527 kubelet[2918]: E0325 01:33:42.339226 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.339527 kubelet[2918]: I0325 01:33:42.339264 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69qlp\" (UniqueName: \"kubernetes.io/projected/8cfce3bd-710f-418f-a9b3-28030dc77aba-kube-api-access-69qlp\") pod \"csi-node-driver-zwppf\" (UID: \"8cfce3bd-710f-418f-a9b3-28030dc77aba\") " pod="calico-system/csi-node-driver-zwppf" Mar 25 01:33:42.339527 kubelet[2918]: E0325 01:33:42.339419 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.339527 kubelet[2918]: W0325 01:33:42.339494 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.339527 kubelet[2918]: E0325 01:33:42.339529 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.341009 kubelet[2918]: E0325 01:33:42.339779 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.341009 kubelet[2918]: W0325 01:33:42.339792 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.341009 kubelet[2918]: E0325 01:33:42.339822 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.341009 kubelet[2918]: E0325 01:33:42.340073 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.341009 kubelet[2918]: W0325 01:33:42.340085 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.341009 kubelet[2918]: E0325 01:33:42.340115 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.341009 kubelet[2918]: I0325 01:33:42.340149 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cfce3bd-710f-418f-a9b3-28030dc77aba-socket-dir\") pod \"csi-node-driver-zwppf\" (UID: \"8cfce3bd-710f-418f-a9b3-28030dc77aba\") " pod="calico-system/csi-node-driver-zwppf" Mar 25 01:33:42.341009 kubelet[2918]: E0325 01:33:42.340417 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.341009 kubelet[2918]: W0325 01:33:42.340473 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.341584 kubelet[2918]: E0325 01:33:42.340508 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.341584 kubelet[2918]: I0325 01:33:42.340532 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8cfce3bd-710f-418f-a9b3-28030dc77aba-kubelet-dir\") pod \"csi-node-driver-zwppf\" (UID: \"8cfce3bd-710f-418f-a9b3-28030dc77aba\") " pod="calico-system/csi-node-driver-zwppf" Mar 25 01:33:42.341584 kubelet[2918]: E0325 01:33:42.340831 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.341584 kubelet[2918]: W0325 01:33:42.340847 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.341584 kubelet[2918]: E0325 01:33:42.340869 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.342823 kubelet[2918]: E0325 01:33:42.342751 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.342823 kubelet[2918]: W0325 01:33:42.342772 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.342823 kubelet[2918]: E0325 01:33:42.342795 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.344562 kubelet[2918]: E0325 01:33:42.344508 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.344562 kubelet[2918]: W0325 01:33:42.344529 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.344562 kubelet[2918]: E0325 01:33:42.344550 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.344778 kubelet[2918]: E0325 01:33:42.344756 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.344778 kubelet[2918]: W0325 01:33:42.344768 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.345021 kubelet[2918]: E0325 01:33:42.344881 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.345428 kubelet[2918]: E0325 01:33:42.345393 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.345428 kubelet[2918]: W0325 01:33:42.345413 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.345554 kubelet[2918]: E0325 01:33:42.345428 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.345826 kubelet[2918]: E0325 01:33:42.345801 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.345826 kubelet[2918]: W0325 01:33:42.345821 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.345985 kubelet[2918]: E0325 01:33:42.345836 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.411845 systemd[1]: Started cri-containerd-3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b.scope - libcontainer container 3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b. Mar 25 01:33:42.417737 containerd[1528]: time="2025-03-25T01:33:42.417424081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xrwrw,Uid:d3b4fc6b-6a13-424a-a46e-4552d4b72a7a,Namespace:calico-system,Attempt:0,}" Mar 25 01:33:42.443856 kubelet[2918]: E0325 01:33:42.443703 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.443856 kubelet[2918]: W0325 01:33:42.443733 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.443856 kubelet[2918]: E0325 01:33:42.443776 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.446536 kubelet[2918]: E0325 01:33:42.445446 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.446536 kubelet[2918]: W0325 01:33:42.445465 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.446536 kubelet[2918]: E0325 01:33:42.445489 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.448648 kubelet[2918]: E0325 01:33:42.448609 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.448648 kubelet[2918]: W0325 01:33:42.448644 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.448791 kubelet[2918]: E0325 01:33:42.448669 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.448937 kubelet[2918]: E0325 01:33:42.448915 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.448937 kubelet[2918]: W0325 01:33:42.448934 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.449104 kubelet[2918]: E0325 01:33:42.449076 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.449254 kubelet[2918]: E0325 01:33:42.449234 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.449254 kubelet[2918]: W0325 01:33:42.449252 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.449554 kubelet[2918]: E0325 01:33:42.449422 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.449736 kubelet[2918]: E0325 01:33:42.449713 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.449736 kubelet[2918]: W0325 01:33:42.449732 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.450490 kubelet[2918]: E0325 01:33:42.450466 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.450669 kubelet[2918]: E0325 01:33:42.450646 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.450669 kubelet[2918]: W0325 01:33:42.450665 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.450774 kubelet[2918]: E0325 01:33:42.450686 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.451016 kubelet[2918]: E0325 01:33:42.450994 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.451016 kubelet[2918]: W0325 01:33:42.451013 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.453413 kubelet[2918]: E0325 01:33:42.453376 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.453731 kubelet[2918]: E0325 01:33:42.453710 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.453731 kubelet[2918]: W0325 01:33:42.453729 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.454076 kubelet[2918]: E0325 01:33:42.454045 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.454324 kubelet[2918]: E0325 01:33:42.454292 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.454324 kubelet[2918]: W0325 01:33:42.454311 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.454543 kubelet[2918]: E0325 01:33:42.454468 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.454865 kubelet[2918]: E0325 01:33:42.454838 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.454997 kubelet[2918]: W0325 01:33:42.454868 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.454997 kubelet[2918]: E0325 01:33:42.454902 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.455256 kubelet[2918]: E0325 01:33:42.455233 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.455256 kubelet[2918]: W0325 01:33:42.455254 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.455382 kubelet[2918]: E0325 01:33:42.455278 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.456549 kubelet[2918]: E0325 01:33:42.456526 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.456549 kubelet[2918]: W0325 01:33:42.456545 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.456790 kubelet[2918]: E0325 01:33:42.456660 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.456949 kubelet[2918]: E0325 01:33:42.456836 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.456949 kubelet[2918]: W0325 01:33:42.456853 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.457267 kubelet[2918]: E0325 01:33:42.457215 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.457460 kubelet[2918]: E0325 01:33:42.457405 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.457460 kubelet[2918]: W0325 01:33:42.457424 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.457698 kubelet[2918]: E0325 01:33:42.457653 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.459605 kubelet[2918]: E0325 01:33:42.459581 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.459605 kubelet[2918]: W0325 01:33:42.459601 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.459744 kubelet[2918]: E0325 01:33:42.459707 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.460000 kubelet[2918]: E0325 01:33:42.459875 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.460000 kubelet[2918]: W0325 01:33:42.459894 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.460356 kubelet[2918]: E0325 01:33:42.460321 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.460768 kubelet[2918]: E0325 01:33:42.460579 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.460768 kubelet[2918]: W0325 01:33:42.460598 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.460768 kubelet[2918]: E0325 01:33:42.460737 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.460944 kubelet[2918]: E0325 01:33:42.460824 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.460944 kubelet[2918]: W0325 01:33:42.460836 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.461839 kubelet[2918]: E0325 01:33:42.461466 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.462027 kubelet[2918]: E0325 01:33:42.462007 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.462027 kubelet[2918]: W0325 01:33:42.462025 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.462158 kubelet[2918]: E0325 01:33:42.462132 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.463479 kubelet[2918]: E0325 01:33:42.463034 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.463479 kubelet[2918]: W0325 01:33:42.463053 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.463479 kubelet[2918]: E0325 01:33:42.463088 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.469555 kubelet[2918]: E0325 01:33:42.469528 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.469555 kubelet[2918]: W0325 01:33:42.469548 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.470451 kubelet[2918]: E0325 01:33:42.470390 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.470451 kubelet[2918]: E0325 01:33:42.470408 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.470451 kubelet[2918]: W0325 01:33:42.470423 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.470785 kubelet[2918]: E0325 01:33:42.470530 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.470834 kubelet[2918]: E0325 01:33:42.470788 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.470834 kubelet[2918]: W0325 01:33:42.470802 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.471458 kubelet[2918]: E0325 01:33:42.471420 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.471832 kubelet[2918]: E0325 01:33:42.471631 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.471832 kubelet[2918]: W0325 01:33:42.471652 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.471832 kubelet[2918]: E0325 01:33:42.471680 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:33:42.502267 kubelet[2918]: E0325 01:33:42.502207 2918 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:33:42.503011 kubelet[2918]: W0325 01:33:42.502235 2918 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:33:42.503011 kubelet[2918]: E0325 01:33:42.502892 2918 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:33:42.538798 containerd[1528]: time="2025-03-25T01:33:42.538605876Z" level=info msg="connecting to shim ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f" address="unix:///run/containerd/s/3bb73c60dd02c414850fae8312860640b4546291d21e826207ad978c81c16916" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:33:42.587671 systemd[1]: Started cri-containerd-ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f.scope - libcontainer container ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f. Mar 25 01:33:42.679949 containerd[1528]: time="2025-03-25T01:33:42.678887904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xrwrw,Uid:d3b4fc6b-6a13-424a-a46e-4552d4b72a7a,Namespace:calico-system,Attempt:0,} returns sandbox id \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\"" Mar 25 01:33:42.691094 containerd[1528]: time="2025-03-25T01:33:42.689279997Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 25 01:33:42.712287 containerd[1528]: time="2025-03-25T01:33:42.712240368Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-687dd987cb-wwwhk,Uid:fd871a25-7d2c-4fb4-8049-0fd8f241ab9c,Namespace:calico-system,Attempt:0,} returns sandbox id \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\"" Mar 25 01:33:44.016520 sshd[3289]: PAM: Permission denied for root from 218.92.0.221 Mar 25 01:33:44.098221 kubelet[2918]: E0325 01:33:44.097630 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zwppf" podUID="8cfce3bd-710f-418f-a9b3-28030dc77aba" Mar 25 01:33:44.445929 containerd[1528]: time="2025-03-25T01:33:44.445818089Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:44.448026 containerd[1528]: time="2025-03-25T01:33:44.447821412Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 25 01:33:44.449577 containerd[1528]: time="2025-03-25T01:33:44.448582098Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:44.450740 sshd-session[3469]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.221 user=root Mar 25 01:33:44.452062 containerd[1528]: time="2025-03-25T01:33:44.452005108Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:44.453351 containerd[1528]: time="2025-03-25T01:33:44.453010488Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 1.763676927s" Mar 25 01:33:44.453351 containerd[1528]: 
time="2025-03-25T01:33:44.453071914Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 25 01:33:44.455697 containerd[1528]: time="2025-03-25T01:33:44.454845163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 25 01:33:44.456762 containerd[1528]: time="2025-03-25T01:33:44.456726336Z" level=info msg="CreateContainer within sandbox \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 01:33:44.508488 containerd[1528]: time="2025-03-25T01:33:44.507921459Z" level=info msg="Container 3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:33:44.524895 containerd[1528]: time="2025-03-25T01:33:44.524810982Z" level=info msg="CreateContainer within sandbox \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\"" Mar 25 01:33:44.526787 containerd[1528]: time="2025-03-25T01:33:44.526741104Z" level=info msg="StartContainer for \"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\"" Mar 25 01:33:44.534495 containerd[1528]: time="2025-03-25T01:33:44.534450702Z" level=info msg="connecting to shim 3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b" address="unix:///run/containerd/s/3bb73c60dd02c414850fae8312860640b4546291d21e826207ad978c81c16916" protocol=ttrpc version=3 Mar 25 01:33:44.572692 systemd[1]: Started cri-containerd-3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b.scope - libcontainer container 3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b. Mar 25 01:33:44.664219 containerd[1528]: time="2025-03-25T01:33:44.662101186Z" level=info msg="StartContainer for \"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\" returns successfully" Mar 25 01:33:44.687726 systemd[1]: cri-containerd-3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b.scope: Deactivated successfully. Mar 25 01:33:44.730217 containerd[1528]: time="2025-03-25T01:33:44.729905852Z" level=info msg="received exit event container_id:\"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\" id:\"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\" pid:3483 exited_at:{seconds:1742866424 nanos:690907734}" Mar 25 01:33:44.755608 containerd[1528]: time="2025-03-25T01:33:44.755541004Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\" id:\"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\" pid:3483 exited_at:{seconds:1742866424 nanos:690907734}" Mar 25 01:33:44.782648 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b-rootfs.mount: Deactivated successfully. 
Mar 25 01:33:46.103034 kubelet[2918]: E0325 01:33:46.100310 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zwppf" podUID="8cfce3bd-710f-418f-a9b3-28030dc77aba" Mar 25 01:33:46.201507 sshd[3289]: PAM: Permission denied for root from 218.92.0.221 Mar 25 01:33:46.650722 sshd-session[3521]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.221 user=root Mar 25 01:33:47.472906 containerd[1528]: time="2025-03-25T01:33:47.472795925Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:47.474876 containerd[1528]: time="2025-03-25T01:33:47.473895082Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075" Mar 25 01:33:47.475561 containerd[1528]: time="2025-03-25T01:33:47.475360674Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:47.479718 containerd[1528]: time="2025-03-25T01:33:47.479662137Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:47.480645 containerd[1528]: time="2025-03-25T01:33:47.480606291Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 3.025717729s" Mar 25 01:33:47.480731 containerd[1528]: time="2025-03-25T01:33:47.480648846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\"" Mar 25 01:33:47.483181 containerd[1528]: time="2025-03-25T01:33:47.482950014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 25 01:33:47.504966 containerd[1528]: time="2025-03-25T01:33:47.504109295Z" level=info msg="CreateContainer within sandbox \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 25 01:33:47.516463 containerd[1528]: time="2025-03-25T01:33:47.513048444Z" level=info msg="Container 9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:33:47.524154 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount781420929.mount: Deactivated successfully. 
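The image pulls above (pod2daemon-flexvol in roughly 1.8s, typha in roughly 3s) are performed by containerd on behalf of the kubelet through its CRI service, which keeps images in the k8s.io namespace seen in the shim log lines. As a standalone illustration, not the kubelet's code path, the same pull can be reproduced with the containerd Go client; the socket path is the assumed default, and the import paths follow the pre-2.0 client layout (containerd 2.x relocated the client package).

package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Assumed default containerd socket; CRI-managed images live in the
	// "k8s.io" namespace, matching namespace=k8s.io in the shim log lines.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/typha:v3.29.2",
		containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("pulled %s, digest %s", img.Name(), img.Target().Digest)
}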
Mar 25 01:33:47.535681 containerd[1528]: time="2025-03-25T01:33:47.535053974Z" level=info msg="CreateContainer within sandbox \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\"" Mar 25 01:33:47.537198 containerd[1528]: time="2025-03-25T01:33:47.536952634Z" level=info msg="StartContainer for \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\"" Mar 25 01:33:47.539865 containerd[1528]: time="2025-03-25T01:33:47.539835321Z" level=info msg="connecting to shim 9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e" address="unix:///run/containerd/s/3569dcd4d50efa8639133b4af0b971b773ea8900e1a183e0e7b3489eac525405" protocol=ttrpc version=3 Mar 25 01:33:47.575637 systemd[1]: Started cri-containerd-9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e.scope - libcontainer container 9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e. Mar 25 01:33:47.664795 containerd[1528]: time="2025-03-25T01:33:47.664729676Z" level=info msg="StartContainer for \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\" returns successfully" Mar 25 01:33:48.097214 kubelet[2918]: E0325 01:33:48.096746 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zwppf" podUID="8cfce3bd-710f-418f-a9b3-28030dc77aba" Mar 25 01:33:48.677315 sshd[3289]: PAM: Permission denied for root from 218.92.0.221 Mar 25 01:33:48.897802 sshd[3289]: Received disconnect from 218.92.0.221 port 38464:11: [preauth] Mar 25 01:33:48.897802 sshd[3289]: Disconnected from authenticating user root 218.92.0.221 port 38464 [preauth] Mar 25 01:33:48.901722 systemd[1]: sshd@15-10.243.75.178:22-218.92.0.221:38464.service: Deactivated successfully. 
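Interleaved with the Calico bring-up, sshd records a brute-force attempt against root from 218.92.0.221: repeated PAM permission-denied lines, a preauth disconnect, and finally systemd cleaning up the per-connection sshd@ unit. As a hedged, offline illustration of triaging this kind of noise (nothing on this host runs such a tool), here is a small Go sketch that tallies failed root logins per source address from a plain-text journal export; the journal.txt file name and the exact line format it matches are assumptions taken from the entries above.

package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"regexp"
)

func main() {
	// Matches lines such as:
	//   sshd[3289]: PAM: Permission denied for root from 218.92.0.221
	re := regexp.MustCompile(`PAM: Permission denied for root from (\S+)`)

	f, err := os.Open("journal.txt") // assumed export, e.g. from journalctl
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	counts := map[string]int{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	if err := sc.Err(); err != nil {
		log.Fatal(err)
	}
	for ip, n := range counts {
		fmt.Printf("%s\t%d failed root logins\n", ip, n)
	}
}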
Mar 25 01:33:49.295760 kubelet[2918]: I0325 01:33:49.294403 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:33:50.101551 kubelet[2918]: E0325 01:33:50.098294 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zwppf" podUID="8cfce3bd-710f-418f-a9b3-28030dc77aba" Mar 25 01:33:52.097386 kubelet[2918]: E0325 01:33:52.097282 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zwppf" podUID="8cfce3bd-710f-418f-a9b3-28030dc77aba" Mar 25 01:33:54.064739 containerd[1528]: time="2025-03-25T01:33:54.064635772Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:54.066907 containerd[1528]: time="2025-03-25T01:33:54.066809312Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 25 01:33:54.067383 containerd[1528]: time="2025-03-25T01:33:54.067325216Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:54.071065 containerd[1528]: time="2025-03-25T01:33:54.070817879Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:33:54.072002 containerd[1528]: time="2025-03-25T01:33:54.071966470Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 6.588970523s" Mar 25 01:33:54.072085 containerd[1528]: time="2025-03-25T01:33:54.072017927Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 25 01:33:54.075797 containerd[1528]: time="2025-03-25T01:33:54.075726475Z" level=info msg="CreateContainer within sandbox \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 01:33:54.094792 containerd[1528]: time="2025-03-25T01:33:54.094713186Z" level=info msg="Container 00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:33:54.098116 kubelet[2918]: E0325 01:33:54.098063 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zwppf" podUID="8cfce3bd-710f-418f-a9b3-28030dc77aba" Mar 25 01:33:54.113165 containerd[1528]: time="2025-03-25T01:33:54.113028661Z" level=info msg="CreateContainer within sandbox 
\"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\"" Mar 25 01:33:54.114653 containerd[1528]: time="2025-03-25T01:33:54.114603481Z" level=info msg="StartContainer for \"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\"" Mar 25 01:33:54.119375 containerd[1528]: time="2025-03-25T01:33:54.119341708Z" level=info msg="connecting to shim 00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908" address="unix:///run/containerd/s/3bb73c60dd02c414850fae8312860640b4546291d21e826207ad978c81c16916" protocol=ttrpc version=3 Mar 25 01:33:54.194739 systemd[1]: Started cri-containerd-00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908.scope - libcontainer container 00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908. Mar 25 01:33:54.273421 containerd[1528]: time="2025-03-25T01:33:54.273257339Z" level=info msg="StartContainer for \"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\" returns successfully" Mar 25 01:33:54.350989 kubelet[2918]: I0325 01:33:54.349198 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-687dd987cb-wwwhk" podStartSLOduration=8.582445417 podStartE2EDuration="13.34914105s" podCreationTimestamp="2025-03-25 01:33:41 +0000 UTC" firstStartedPulling="2025-03-25 01:33:42.71516682 +0000 UTC m=+22.867776975" lastFinishedPulling="2025-03-25 01:33:47.481862442 +0000 UTC m=+27.634472608" observedRunningTime="2025-03-25 01:33:48.308299958 +0000 UTC m=+28.460910132" watchObservedRunningTime="2025-03-25 01:33:54.34914105 +0000 UTC m=+34.501751207" Mar 25 01:33:55.142880 systemd[1]: cri-containerd-00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908.scope: Deactivated successfully. Mar 25 01:33:55.143351 systemd[1]: cri-containerd-00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908.scope: Consumed 724ms CPU time, 149.9M memory peak, 1.8M read from disk, 154M written to disk. Mar 25 01:33:55.151111 containerd[1528]: time="2025-03-25T01:33:55.150763487Z" level=info msg="received exit event container_id:\"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\" id:\"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\" pid:3586 exited_at:{seconds:1742866435 nanos:149935782}" Mar 25 01:33:55.154345 containerd[1528]: time="2025-03-25T01:33:55.153472419Z" level=info msg="TaskExit event in podsandbox handler container_id:\"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\" id:\"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\" pid:3586 exited_at:{seconds:1742866435 nanos:149935782}" Mar 25 01:33:55.197816 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908-rootfs.mount: Deactivated successfully. 
Mar 25 01:33:55.274998 kubelet[2918]: I0325 01:33:55.274062 2918 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 25 01:33:55.415541 kubelet[2918]: I0325 01:33:55.414963 2918 topology_manager.go:215] "Topology Admit Handler" podUID="19db9a96-b50d-4635-8125-6ec125a3de16" podNamespace="kube-system" podName="coredns-7db6d8ff4d-m5nkq" Mar 25 01:33:55.422281 kubelet[2918]: I0325 01:33:55.421881 2918 topology_manager.go:215] "Topology Admit Handler" podUID="6685e276-da04-413c-850f-cc7ff0494084" podNamespace="calico-system" podName="calico-kube-controllers-849f5dfb9c-6lwwx" Mar 25 01:33:55.432394 kubelet[2918]: I0325 01:33:55.431132 2918 topology_manager.go:215] "Topology Admit Handler" podUID="52021031-372c-4e64-8acc-2655c18d3f55" podNamespace="kube-system" podName="coredns-7db6d8ff4d-5slnx" Mar 25 01:33:55.438353 kubelet[2918]: I0325 01:33:55.437756 2918 topology_manager.go:215] "Topology Admit Handler" podUID="c2d89f5e-3072-423c-9b85-c270e603fd27" podNamespace="calico-apiserver" podName="calico-apiserver-6f45686d89-b47pq" Mar 25 01:33:55.438353 kubelet[2918]: I0325 01:33:55.437945 2918 topology_manager.go:215] "Topology Admit Handler" podUID="d23f16c0-841b-4e22-af15-6a6e81adc9e9" podNamespace="calico-apiserver" podName="calico-apiserver-6f45686d89-b5srm" Mar 25 01:33:55.456832 kubelet[2918]: I0325 01:33:55.456791 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kzwn\" (UniqueName: \"kubernetes.io/projected/6685e276-da04-413c-850f-cc7ff0494084-kube-api-access-6kzwn\") pod \"calico-kube-controllers-849f5dfb9c-6lwwx\" (UID: \"6685e276-da04-413c-850f-cc7ff0494084\") " pod="calico-system/calico-kube-controllers-849f5dfb9c-6lwwx" Mar 25 01:33:55.457055 kubelet[2918]: I0325 01:33:55.457028 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d23f16c0-841b-4e22-af15-6a6e81adc9e9-calico-apiserver-certs\") pod \"calico-apiserver-6f45686d89-b5srm\" (UID: \"d23f16c0-841b-4e22-af15-6a6e81adc9e9\") " pod="calico-apiserver/calico-apiserver-6f45686d89-b5srm" Mar 25 01:33:55.457196 kubelet[2918]: I0325 01:33:55.457172 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c2d89f5e-3072-423c-9b85-c270e603fd27-calico-apiserver-certs\") pod \"calico-apiserver-6f45686d89-b47pq\" (UID: \"c2d89f5e-3072-423c-9b85-c270e603fd27\") " pod="calico-apiserver/calico-apiserver-6f45686d89-b47pq" Mar 25 01:33:55.457323 kubelet[2918]: I0325 01:33:55.457300 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt7c5\" (UniqueName: \"kubernetes.io/projected/19db9a96-b50d-4635-8125-6ec125a3de16-kube-api-access-vt7c5\") pod \"coredns-7db6d8ff4d-m5nkq\" (UID: \"19db9a96-b50d-4635-8125-6ec125a3de16\") " pod="kube-system/coredns-7db6d8ff4d-m5nkq" Mar 25 01:33:55.458091 kubelet[2918]: I0325 01:33:55.457523 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6685e276-da04-413c-850f-cc7ff0494084-tigera-ca-bundle\") pod \"calico-kube-controllers-849f5dfb9c-6lwwx\" (UID: \"6685e276-da04-413c-850f-cc7ff0494084\") " pod="calico-system/calico-kube-controllers-849f5dfb9c-6lwwx" Mar 25 01:33:55.458091 kubelet[2918]: I0325 01:33:55.457560 2918 
reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19db9a96-b50d-4635-8125-6ec125a3de16-config-volume\") pod \"coredns-7db6d8ff4d-m5nkq\" (UID: \"19db9a96-b50d-4635-8125-6ec125a3de16\") " pod="kube-system/coredns-7db6d8ff4d-m5nkq" Mar 25 01:33:55.458091 kubelet[2918]: I0325 01:33:55.457598 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k25mg\" (UniqueName: \"kubernetes.io/projected/52021031-372c-4e64-8acc-2655c18d3f55-kube-api-access-k25mg\") pod \"coredns-7db6d8ff4d-5slnx\" (UID: \"52021031-372c-4e64-8acc-2655c18d3f55\") " pod="kube-system/coredns-7db6d8ff4d-5slnx" Mar 25 01:33:55.458091 kubelet[2918]: I0325 01:33:55.457635 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxql6\" (UniqueName: \"kubernetes.io/projected/d23f16c0-841b-4e22-af15-6a6e81adc9e9-kube-api-access-fxql6\") pod \"calico-apiserver-6f45686d89-b5srm\" (UID: \"d23f16c0-841b-4e22-af15-6a6e81adc9e9\") " pod="calico-apiserver/calico-apiserver-6f45686d89-b5srm" Mar 25 01:33:55.458091 kubelet[2918]: I0325 01:33:55.457673 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qgzr\" (UniqueName: \"kubernetes.io/projected/c2d89f5e-3072-423c-9b85-c270e603fd27-kube-api-access-6qgzr\") pod \"calico-apiserver-6f45686d89-b47pq\" (UID: \"c2d89f5e-3072-423c-9b85-c270e603fd27\") " pod="calico-apiserver/calico-apiserver-6f45686d89-b47pq" Mar 25 01:33:55.459474 kubelet[2918]: I0325 01:33:55.457735 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52021031-372c-4e64-8acc-2655c18d3f55-config-volume\") pod \"coredns-7db6d8ff4d-5slnx\" (UID: \"52021031-372c-4e64-8acc-2655c18d3f55\") " pod="kube-system/coredns-7db6d8ff4d-5slnx" Mar 25 01:33:55.470846 systemd[1]: Created slice kubepods-burstable-pod19db9a96_b50d_4635_8125_6ec125a3de16.slice - libcontainer container kubepods-burstable-pod19db9a96_b50d_4635_8125_6ec125a3de16.slice. Mar 25 01:33:55.481755 systemd[1]: Created slice kubepods-besteffort-podc2d89f5e_3072_423c_9b85_c270e603fd27.slice - libcontainer container kubepods-besteffort-podc2d89f5e_3072_423c_9b85_c270e603fd27.slice. Mar 25 01:33:55.495003 systemd[1]: Created slice kubepods-besteffort-pod6685e276_da04_413c_850f_cc7ff0494084.slice - libcontainer container kubepods-besteffort-pod6685e276_da04_413c_850f_cc7ff0494084.slice. Mar 25 01:33:55.507193 systemd[1]: Created slice kubepods-burstable-pod52021031_372c_4e64_8acc_2655c18d3f55.slice - libcontainer container kubepods-burstable-pod52021031_372c_4e64_8acc_2655c18d3f55.slice. Mar 25 01:33:55.519620 systemd[1]: Created slice kubepods-besteffort-podd23f16c0_841b_4e22_af15_6a6e81adc9e9.slice - libcontainer container kubepods-besteffort-podd23f16c0_841b_4e22_af15_6a6e81adc9e9.slice. 
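The Created slice lines show the kubelet giving each newly admitted pod a cgroup slice named after its QoS class: the two coredns pods land under kubepods-burstable-pod<uid>.slice, while the calico-kube-controllers and calico-apiserver pods land under kubepods-besteffort-pod<uid>.slice. Below is a compact, simplified sketch of the standard Kubernetes QoS classification that drives this naming; the container struct, the qosClass function, and the example resource values are illustrative, not kubelet code.

package main

import "fmt"

// container describes just the CPU/memory requests and limits that matter for
// QoS classification (illustrative struct, not the real API types).
type container struct {
	Requests map[string]string
	Limits   map[string]string
}

// qosClass applies the usual Kubernetes rules in simplified form: Guaranteed
// when every container has equal, non-empty CPU and memory requests and
// limits; BestEffort when no container sets any request or limit; Burstable
// otherwise.
func qosClass(containers []container) string {
	anySet := false
	allGuaranteed := true
	for _, c := range containers {
		if len(c.Requests) > 0 || len(c.Limits) > 0 {
			anySet = true
		}
		for _, res := range []string{"cpu", "memory"} {
			if c.Limits[res] == "" || c.Requests[res] != c.Limits[res] {
				allGuaranteed = false
			}
		}
	}
	switch {
	case !anySet:
		return "BestEffort" // e.g. kubepods-besteffort-pod<uid>.slice
	case allGuaranteed:
		return "Guaranteed" // lands directly under kubepods.slice
	default:
		return "Burstable" // e.g. kubepods-burstable-pod<uid>.slice
	}
}

func main() {
	coredns := []container{{
		Requests: map[string]string{"cpu": "100m", "memory": "70Mi"},
		Limits:   map[string]string{"memory": "170Mi"},
	}}
	apiserver := []container{{}} // no requests or limits at all
	fmt.Println(qosClass(coredns), qosClass(apiserver)) // Burstable BestEffort
}

The example values mirror a coredns-style spec (requests set, only a memory limit), which classifies as Burstable, while a pod with no requests or limits at all is BestEffort, matching the slice names above.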
Mar 25 01:33:55.779360 containerd[1528]: time="2025-03-25T01:33:55.779242139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-m5nkq,Uid:19db9a96-b50d-4635-8125-6ec125a3de16,Namespace:kube-system,Attempt:0,}" Mar 25 01:33:55.789221 containerd[1528]: time="2025-03-25T01:33:55.788943788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f45686d89-b47pq,Uid:c2d89f5e-3072-423c-9b85-c270e603fd27,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:33:55.807328 containerd[1528]: time="2025-03-25T01:33:55.807274134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-849f5dfb9c-6lwwx,Uid:6685e276-da04-413c-850f-cc7ff0494084,Namespace:calico-system,Attempt:0,}" Mar 25 01:33:55.817407 containerd[1528]: time="2025-03-25T01:33:55.817276613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-5slnx,Uid:52021031-372c-4e64-8acc-2655c18d3f55,Namespace:kube-system,Attempt:0,}" Mar 25 01:33:55.829077 containerd[1528]: time="2025-03-25T01:33:55.829041053Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f45686d89-b5srm,Uid:d23f16c0-841b-4e22-af15-6a6e81adc9e9,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:33:56.066294 containerd[1528]: time="2025-03-25T01:33:56.065935366Z" level=error msg="Failed to destroy network for sandbox \"346beb8c8f6fa378a257d0a7db22c63cfbd75afe356f53d3c21a6b3e8faa7486\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.066707 containerd[1528]: time="2025-03-25T01:33:56.066563533Z" level=error msg="Failed to destroy network for sandbox \"1fc15b59ddb1cebb22016a206883604ce945fe3848925cb19632daf7228e5588\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.093774 containerd[1528]: time="2025-03-25T01:33:56.074269033Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-5slnx,Uid:52021031-372c-4e64-8acc-2655c18d3f55,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fc15b59ddb1cebb22016a206883604ce945fe3848925cb19632daf7228e5588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.094352 containerd[1528]: time="2025-03-25T01:33:56.081464085Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f45686d89-b5srm,Uid:d23f16c0-841b-4e22-af15-6a6e81adc9e9,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"346beb8c8f6fa378a257d0a7db22c63cfbd75afe356f53d3c21a6b3e8faa7486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.105547 containerd[1528]: time="2025-03-25T01:33:56.093577496Z" level=error msg="Failed to destroy network for sandbox \"e2ddee6ac4e93901ec7dabd86d6f536c9bff4b711a2e076057637f02f66bc8ec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Mar 25 01:33:56.106922 containerd[1528]: time="2025-03-25T01:33:56.106721470Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-m5nkq,Uid:19db9a96-b50d-4635-8125-6ec125a3de16,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2ddee6ac4e93901ec7dabd86d6f536c9bff4b711a2e076057637f02f66bc8ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.106922 containerd[1528]: time="2025-03-25T01:33:56.093635544Z" level=error msg="Failed to destroy network for sandbox \"1cebd140e9f2ca0c9b422535c8efc758fd22236ebc995dedf633d5a226e9e6da\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.108461 containerd[1528]: time="2025-03-25T01:33:56.108225409Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f45686d89-b47pq,Uid:c2d89f5e-3072-423c-9b85-c270e603fd27,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cebd140e9f2ca0c9b422535c8efc758fd22236ebc995dedf633d5a226e9e6da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.108461 containerd[1528]: time="2025-03-25T01:33:56.108366095Z" level=error msg="Failed to destroy network for sandbox \"5061c7de93a024c4936d0f59a0741ed3cc3571f12b37da8871a17ba3a0255a3c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.110161 containerd[1528]: time="2025-03-25T01:33:56.110096413Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-849f5dfb9c-6lwwx,Uid:6685e276-da04-413c-850f-cc7ff0494084,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5061c7de93a024c4936d0f59a0741ed3cc3571f12b37da8871a17ba3a0255a3c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.112772 systemd[1]: Created slice kubepods-besteffort-pod8cfce3bd_710f_418f_a9b3_28030dc77aba.slice - libcontainer container kubepods-besteffort-pod8cfce3bd_710f_418f_a9b3_28030dc77aba.slice. 
Mar 25 01:33:56.117587 kubelet[2918]: E0325 01:33:56.107374 2918 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"346beb8c8f6fa378a257d0a7db22c63cfbd75afe356f53d3c21a6b3e8faa7486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.117587 kubelet[2918]: E0325 01:33:56.107374 2918 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2ddee6ac4e93901ec7dabd86d6f536c9bff4b711a2e076057637f02f66bc8ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.117834 kubelet[2918]: E0325 01:33:56.117606 2918 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fc15b59ddb1cebb22016a206883604ce945fe3848925cb19632daf7228e5588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.120504 kubelet[2918]: E0325 01:33:56.119506 2918 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5061c7de93a024c4936d0f59a0741ed3cc3571f12b37da8871a17ba3a0255a3c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.120504 kubelet[2918]: E0325 01:33:56.119969 2918 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cebd140e9f2ca0c9b422535c8efc758fd22236ebc995dedf633d5a226e9e6da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.120504 kubelet[2918]: E0325 01:33:56.120212 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"346beb8c8f6fa378a257d0a7db22c63cfbd75afe356f53d3c21a6b3e8faa7486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f45686d89-b5srm" Mar 25 01:33:56.120504 kubelet[2918]: E0325 01:33:56.120260 2918 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"346beb8c8f6fa378a257d0a7db22c63cfbd75afe356f53d3c21a6b3e8faa7486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f45686d89-b5srm" Mar 25 01:33:56.120772 kubelet[2918]: E0325 01:33:56.120334 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6f45686d89-b5srm_calico-apiserver(d23f16c0-841b-4e22-af15-6a6e81adc9e9)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"calico-apiserver-6f45686d89-b5srm_calico-apiserver(d23f16c0-841b-4e22-af15-6a6e81adc9e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"346beb8c8f6fa378a257d0a7db22c63cfbd75afe356f53d3c21a6b3e8faa7486\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6f45686d89-b5srm" podUID="d23f16c0-841b-4e22-af15-6a6e81adc9e9" Mar 25 01:33:56.122419 kubelet[2918]: E0325 01:33:56.122275 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5061c7de93a024c4936d0f59a0741ed3cc3571f12b37da8871a17ba3a0255a3c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-849f5dfb9c-6lwwx" Mar 25 01:33:56.122540 kubelet[2918]: E0325 01:33:56.122422 2918 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5061c7de93a024c4936d0f59a0741ed3cc3571f12b37da8871a17ba3a0255a3c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-849f5dfb9c-6lwwx" Mar 25 01:33:56.122605 kubelet[2918]: E0325 01:33:56.122532 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-849f5dfb9c-6lwwx_calico-system(6685e276-da04-413c-850f-cc7ff0494084)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-849f5dfb9c-6lwwx_calico-system(6685e276-da04-413c-850f-cc7ff0494084)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5061c7de93a024c4936d0f59a0741ed3cc3571f12b37da8871a17ba3a0255a3c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-849f5dfb9c-6lwwx" podUID="6685e276-da04-413c-850f-cc7ff0494084" Mar 25 01:33:56.122748 kubelet[2918]: E0325 01:33:56.122609 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2ddee6ac4e93901ec7dabd86d6f536c9bff4b711a2e076057637f02f66bc8ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-m5nkq" Mar 25 01:33:56.122748 kubelet[2918]: E0325 01:33:56.122653 2918 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2ddee6ac4e93901ec7dabd86d6f536c9bff4b711a2e076057637f02f66bc8ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-m5nkq" Mar 25 01:33:56.123449 kubelet[2918]: E0325 01:33:56.122738 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-7db6d8ff4d-m5nkq_kube-system(19db9a96-b50d-4635-8125-6ec125a3de16)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-m5nkq_kube-system(19db9a96-b50d-4635-8125-6ec125a3de16)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e2ddee6ac4e93901ec7dabd86d6f536c9bff4b711a2e076057637f02f66bc8ec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-m5nkq" podUID="19db9a96-b50d-4635-8125-6ec125a3de16" Mar 25 01:33:56.123449 kubelet[2918]: E0325 01:33:56.122788 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fc15b59ddb1cebb22016a206883604ce945fe3848925cb19632daf7228e5588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-5slnx" Mar 25 01:33:56.123449 kubelet[2918]: E0325 01:33:56.122865 2918 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fc15b59ddb1cebb22016a206883604ce945fe3848925cb19632daf7228e5588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-5slnx" Mar 25 01:33:56.123663 kubelet[2918]: E0325 01:33:56.122923 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-5slnx_kube-system(52021031-372c-4e64-8acc-2655c18d3f55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-5slnx_kube-system(52021031-372c-4e64-8acc-2655c18d3f55)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1fc15b59ddb1cebb22016a206883604ce945fe3848925cb19632daf7228e5588\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-5slnx" podUID="52021031-372c-4e64-8acc-2655c18d3f55" Mar 25 01:33:56.123863 kubelet[2918]: E0325 01:33:56.123833 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cebd140e9f2ca0c9b422535c8efc758fd22236ebc995dedf633d5a226e9e6da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f45686d89-b47pq" Mar 25 01:33:56.123936 kubelet[2918]: E0325 01:33:56.123898 2918 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cebd140e9f2ca0c9b422535c8efc758fd22236ebc995dedf633d5a226e9e6da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f45686d89-b47pq" Mar 25 01:33:56.124599 kubelet[2918]: E0325 01:33:56.123983 2918 pod_workers.go:1298] "Error 
syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6f45686d89-b47pq_calico-apiserver(c2d89f5e-3072-423c-9b85-c270e603fd27)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6f45686d89-b47pq_calico-apiserver(c2d89f5e-3072-423c-9b85-c270e603fd27)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1cebd140e9f2ca0c9b422535c8efc758fd22236ebc995dedf633d5a226e9e6da\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6f45686d89-b47pq" podUID="c2d89f5e-3072-423c-9b85-c270e603fd27" Mar 25 01:33:56.125641 containerd[1528]: time="2025-03-25T01:33:56.125022930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zwppf,Uid:8cfce3bd-710f-418f-a9b3-28030dc77aba,Namespace:calico-system,Attempt:0,}" Mar 25 01:33:56.201456 containerd[1528]: time="2025-03-25T01:33:56.199576719Z" level=error msg="Failed to destroy network for sandbox \"e94e9842fd0d21b1f835599463c6d8a0f766d079434766ed481808d8e0d1bc0c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.213643 systemd[1]: run-netns-cni\x2d1a6e6edd\x2df44b\x2dfd7a\x2d5bc6\x2df1ad4271a2b0.mount: Deactivated successfully. Mar 25 01:33:56.220987 containerd[1528]: time="2025-03-25T01:33:56.220886144Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zwppf,Uid:8cfce3bd-710f-418f-a9b3-28030dc77aba,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e94e9842fd0d21b1f835599463c6d8a0f766d079434766ed481808d8e0d1bc0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.222362 kubelet[2918]: E0325 01:33:56.221212 2918 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e94e9842fd0d21b1f835599463c6d8a0f766d079434766ed481808d8e0d1bc0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:33:56.222362 kubelet[2918]: E0325 01:33:56.221280 2918 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e94e9842fd0d21b1f835599463c6d8a0f766d079434766ed481808d8e0d1bc0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zwppf" Mar 25 01:33:56.222362 kubelet[2918]: E0325 01:33:56.221317 2918 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e94e9842fd0d21b1f835599463c6d8a0f766d079434766ed481808d8e0d1bc0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zwppf" Mar 25 01:33:56.222674 
kubelet[2918]: E0325 01:33:56.221393 2918 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zwppf_calico-system(8cfce3bd-710f-418f-a9b3-28030dc77aba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zwppf_calico-system(8cfce3bd-710f-418f-a9b3-28030dc77aba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e94e9842fd0d21b1f835599463c6d8a0f766d079434766ed481808d8e0d1bc0c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zwppf" podUID="8cfce3bd-710f-418f-a9b3-28030dc77aba" Mar 25 01:33:56.329609 containerd[1528]: time="2025-03-25T01:33:56.328999929Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 25 01:34:04.507285 kubelet[2918]: I0325 01:34:04.507126 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:34:05.492389 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1709398539.mount: Deactivated successfully. Mar 25 01:34:05.674795 containerd[1528]: time="2025-03-25T01:34:05.650185390Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 25 01:34:05.679025 containerd[1528]: time="2025-03-25T01:34:05.678974523Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:05.702253 containerd[1528]: time="2025-03-25T01:34:05.702152755Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:05.704488 containerd[1528]: time="2025-03-25T01:34:05.703780774Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:05.704488 containerd[1528]: time="2025-03-25T01:34:05.704258119Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 9.374133969s" Mar 25 01:34:05.708876 containerd[1528]: time="2025-03-25T01:34:05.708821213Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 25 01:34:05.730855 containerd[1528]: time="2025-03-25T01:34:05.730807193Z" level=info msg="CreateContainer within sandbox \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 01:34:05.785070 containerd[1528]: time="2025-03-25T01:34:05.784714214Z" level=info msg="Container bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:05.792586 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1967562989.mount: Deactivated successfully. 
Mar 25 01:34:05.880398 containerd[1528]: time="2025-03-25T01:34:05.880285364Z" level=info msg="CreateContainer within sandbox \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\"" Mar 25 01:34:05.881844 containerd[1528]: time="2025-03-25T01:34:05.881813579Z" level=info msg="StartContainer for \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\"" Mar 25 01:34:05.892479 containerd[1528]: time="2025-03-25T01:34:05.892413240Z" level=info msg="connecting to shim bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b" address="unix:///run/containerd/s/3bb73c60dd02c414850fae8312860640b4546291d21e826207ad978c81c16916" protocol=ttrpc version=3 Mar 25 01:34:06.031758 systemd[1]: Started cri-containerd-bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b.scope - libcontainer container bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b. Mar 25 01:34:06.125051 containerd[1528]: time="2025-03-25T01:34:06.124921199Z" level=info msg="StartContainer for \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" returns successfully" Mar 25 01:34:06.346270 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 25 01:34:06.347195 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 25 01:34:06.755770 containerd[1528]: time="2025-03-25T01:34:06.755682142Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" id:\"2b459f5a02e5fc138549853951fa796014d6fcd1366f73f7e1957c7088a57776\" pid:3871 exit_status:1 exited_at:{seconds:1742866446 nanos:753591426}" Mar 25 01:34:07.097901 containerd[1528]: time="2025-03-25T01:34:07.097730838Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f45686d89-b5srm,Uid:d23f16c0-841b-4e22-af15-6a6e81adc9e9,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:34:07.459711 systemd-networkd[1432]: cali941b9249ed2: Link UP Mar 25 01:34:07.460241 systemd-networkd[1432]: cali941b9249ed2: Gained carrier Mar 25 01:34:07.488528 containerd[1528]: 2025-03-25 01:34:07.142 [INFO][3897] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 01:34:07.488528 containerd[1528]: 2025-03-25 01:34:07.192 [INFO][3897] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0 calico-apiserver-6f45686d89- calico-apiserver d23f16c0-841b-4e22-af15-6a6e81adc9e9 737 0 2025-03-25 01:33:42 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f45686d89 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-y0b1r.gb1.brightbox.com calico-apiserver-6f45686d89-b5srm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali941b9249ed2 [] []}} ContainerID="09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b5srm" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-" Mar 25 01:34:07.488528 containerd[1528]: 2025-03-25 01:34:07.192 [INFO][3897] cni-plugin/k8s.go 77: Extracted identifiers for 
CmdAddK8s ContainerID="09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b5srm" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0" Mar 25 01:34:07.488528 containerd[1528]: 2025-03-25 01:34:07.370 [INFO][3908] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" HandleID="k8s-pod-network.09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0" Mar 25 01:34:07.489394 containerd[1528]: 2025-03-25 01:34:07.390 [INFO][3908] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" HandleID="k8s-pod-network.09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003c9540), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-y0b1r.gb1.brightbox.com", "pod":"calico-apiserver-6f45686d89-b5srm", "timestamp":"2025-03-25 01:34:07.370095496 +0000 UTC"}, Hostname:"srv-y0b1r.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:34:07.489394 containerd[1528]: 2025-03-25 01:34:07.390 [INFO][3908] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:34:07.489394 containerd[1528]: 2025-03-25 01:34:07.391 [INFO][3908] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:34:07.489394 containerd[1528]: 2025-03-25 01:34:07.391 [INFO][3908] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-y0b1r.gb1.brightbox.com' Mar 25 01:34:07.489394 containerd[1528]: 2025-03-25 01:34:07.394 [INFO][3908] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:07.489394 containerd[1528]: 2025-03-25 01:34:07.404 [INFO][3908] ipam/ipam.go 372: Looking up existing affinities for host host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:07.489394 containerd[1528]: 2025-03-25 01:34:07.410 [INFO][3908] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:07.489394 containerd[1528]: 2025-03-25 01:34:07.413 [INFO][3908] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:07.489394 containerd[1528]: 2025-03-25 01:34:07.416 [INFO][3908] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:07.491849 containerd[1528]: 2025-03-25 01:34:07.416 [INFO][3908] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:07.491849 containerd[1528]: 2025-03-25 01:34:07.418 [INFO][3908] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c Mar 25 01:34:07.491849 containerd[1528]: 2025-03-25 01:34:07.424 [INFO][3908] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:07.491849 containerd[1528]: 2025-03-25 01:34:07.432 [INFO][3908] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.65/26] block=192.168.91.64/26 handle="k8s-pod-network.09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:07.491849 containerd[1528]: 2025-03-25 01:34:07.432 [INFO][3908] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.65/26] handle="k8s-pod-network.09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:07.491849 containerd[1528]: 2025-03-25 01:34:07.432 [INFO][3908] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:34:07.491849 containerd[1528]: 2025-03-25 01:34:07.432 [INFO][3908] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.65/26] IPv6=[] ContainerID="09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" HandleID="k8s-pod-network.09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0" Mar 25 01:34:07.493320 containerd[1528]: 2025-03-25 01:34:07.435 [INFO][3897] cni-plugin/k8s.go 386: Populated endpoint ContainerID="09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b5srm" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0", GenerateName:"calico-apiserver-6f45686d89-", Namespace:"calico-apiserver", SelfLink:"", UID:"d23f16c0-841b-4e22-af15-6a6e81adc9e9", ResourceVersion:"737", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f45686d89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-6f45686d89-b5srm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali941b9249ed2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:07.494012 containerd[1528]: 2025-03-25 01:34:07.436 [INFO][3897] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.65/32] ContainerID="09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b5srm" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0" Mar 25 01:34:07.494012 containerd[1528]: 2025-03-25 01:34:07.436 [INFO][3897] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali941b9249ed2 ContainerID="09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b5srm" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0" Mar 25 01:34:07.494012 containerd[1528]: 2025-03-25 01:34:07.451 [INFO][3897] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b5srm" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0" Mar 25 01:34:07.494316 containerd[1528]: 2025-03-25 01:34:07.451 [INFO][3897] cni-plugin/k8s.go 414: 
Added Mac, interface name, and active container ID to endpoint ContainerID="09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b5srm" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0", GenerateName:"calico-apiserver-6f45686d89-", Namespace:"calico-apiserver", SelfLink:"", UID:"d23f16c0-841b-4e22-af15-6a6e81adc9e9", ResourceVersion:"737", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f45686d89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c", Pod:"calico-apiserver-6f45686d89-b5srm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali941b9249ed2", MAC:"66:3b:f6:1a:54:22", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:07.495744 containerd[1528]: 2025-03-25 01:34:07.476 [INFO][3897] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b5srm" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b5srm-eth0" Mar 25 01:34:07.502367 kubelet[2918]: I0325 01:34:07.499919 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-xrwrw" podStartSLOduration=2.464003421 podStartE2EDuration="25.48576104s" podCreationTimestamp="2025-03-25 01:33:42 +0000 UTC" firstStartedPulling="2025-03-25 01:33:42.68815251 +0000 UTC m=+22.840762671" lastFinishedPulling="2025-03-25 01:34:05.70991013 +0000 UTC m=+45.862520290" observedRunningTime="2025-03-25 01:34:06.520906502 +0000 UTC m=+46.673516667" watchObservedRunningTime="2025-03-25 01:34:07.48576104 +0000 UTC m=+47.638371195" Mar 25 01:34:07.614735 containerd[1528]: time="2025-03-25T01:34:07.609712956Z" level=info msg="connecting to shim 09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c" address="unix:///run/containerd/s/c192be8051dcd08708305150cabed649c6480a75c85ee5457ba98eba21ae4353" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:34:07.653648 systemd[1]: Started cri-containerd-09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c.scope - libcontainer container 09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c. 
Mar 25 01:34:07.671729 containerd[1528]: time="2025-03-25T01:34:07.671641855Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" id:\"83ad7584509033fd528f924cbf9a57c464140f945fd4dbb9fff85477a1107d18\" pid:3940 exit_status:1 exited_at:{seconds:1742866447 nanos:670533029}" Mar 25 01:34:07.732968 containerd[1528]: time="2025-03-25T01:34:07.732760953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f45686d89-b5srm,Uid:d23f16c0-841b-4e22-af15-6a6e81adc9e9,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c\"" Mar 25 01:34:07.736991 containerd[1528]: time="2025-03-25T01:34:07.736793878Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:34:08.422492 kernel: bpftool[4116]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 25 01:34:08.747778 systemd-networkd[1432]: vxlan.calico: Link UP Mar 25 01:34:08.747789 systemd-networkd[1432]: vxlan.calico: Gained carrier Mar 25 01:34:09.100821 containerd[1528]: time="2025-03-25T01:34:09.099134282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f45686d89-b47pq,Uid:c2d89f5e-3072-423c-9b85-c270e603fd27,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:34:09.100821 containerd[1528]: time="2025-03-25T01:34:09.100598071Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-m5nkq,Uid:19db9a96-b50d-4635-8125-6ec125a3de16,Namespace:kube-system,Attempt:0,}" Mar 25 01:34:09.102712 containerd[1528]: time="2025-03-25T01:34:09.102062457Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-5slnx,Uid:52021031-372c-4e64-8acc-2655c18d3f55,Namespace:kube-system,Attempt:0,}" Mar 25 01:34:09.319277 systemd-networkd[1432]: cali941b9249ed2: Gained IPv6LL Mar 25 01:34:09.566599 systemd-networkd[1432]: cali9edda3f4a45: Link UP Mar 25 01:34:09.570908 systemd-networkd[1432]: cali9edda3f4a45: Gained carrier Mar 25 01:34:09.613220 containerd[1528]: 2025-03-25 01:34:09.317 [INFO][4201] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0 coredns-7db6d8ff4d- kube-system 52021031-372c-4e64-8acc-2655c18d3f55 738 0 2025-03-25 01:33:34 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-y0b1r.gb1.brightbox.com coredns-7db6d8ff4d-5slnx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9edda3f4a45 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5slnx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-" Mar 25 01:34:09.613220 containerd[1528]: 2025-03-25 01:34:09.317 [INFO][4201] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5slnx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0" Mar 25 01:34:09.613220 containerd[1528]: 2025-03-25 01:34:09.454 [INFO][4234] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" HandleID="k8s-pod-network.b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" Workload="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0" Mar 25 01:34:09.613768 containerd[1528]: 2025-03-25 01:34:09.481 [INFO][4234] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" HandleID="k8s-pod-network.b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" Workload="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004036b0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-y0b1r.gb1.brightbox.com", "pod":"coredns-7db6d8ff4d-5slnx", "timestamp":"2025-03-25 01:34:09.454074567 +0000 UTC"}, Hostname:"srv-y0b1r.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:34:09.613768 containerd[1528]: 2025-03-25 01:34:09.481 [INFO][4234] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:34:09.613768 containerd[1528]: 2025-03-25 01:34:09.481 [INFO][4234] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:34:09.613768 containerd[1528]: 2025-03-25 01:34:09.481 [INFO][4234] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-y0b1r.gb1.brightbox.com' Mar 25 01:34:09.613768 containerd[1528]: 2025-03-25 01:34:09.486 [INFO][4234] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.613768 containerd[1528]: 2025-03-25 01:34:09.503 [INFO][4234] ipam/ipam.go 372: Looking up existing affinities for host host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.613768 containerd[1528]: 2025-03-25 01:34:09.512 [INFO][4234] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.613768 containerd[1528]: 2025-03-25 01:34:09.515 [INFO][4234] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.613768 containerd[1528]: 2025-03-25 01:34:09.518 [INFO][4234] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.615691 containerd[1528]: 2025-03-25 01:34:09.518 [INFO][4234] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.615691 containerd[1528]: 2025-03-25 01:34:09.521 [INFO][4234] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed Mar 25 01:34:09.615691 containerd[1528]: 2025-03-25 01:34:09.528 [INFO][4234] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.615691 containerd[1528]: 2025-03-25 01:34:09.539 [INFO][4234] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.66/26] block=192.168.91.64/26 
handle="k8s-pod-network.b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.615691 containerd[1528]: 2025-03-25 01:34:09.539 [INFO][4234] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.66/26] handle="k8s-pod-network.b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.615691 containerd[1528]: 2025-03-25 01:34:09.539 [INFO][4234] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:34:09.615691 containerd[1528]: 2025-03-25 01:34:09.539 [INFO][4234] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.66/26] IPv6=[] ContainerID="b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" HandleID="k8s-pod-network.b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" Workload="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0" Mar 25 01:34:09.616172 containerd[1528]: 2025-03-25 01:34:09.554 [INFO][4201] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5slnx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"52021031-372c-4e64-8acc-2655c18d3f55", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7db6d8ff4d-5slnx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9edda3f4a45", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:09.616172 containerd[1528]: 2025-03-25 01:34:09.555 [INFO][4201] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.66/32] ContainerID="b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5slnx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0" Mar 25 01:34:09.616172 containerd[1528]: 2025-03-25 01:34:09.555 [INFO][4201] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to 
cali9edda3f4a45 ContainerID="b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5slnx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0" Mar 25 01:34:09.616172 containerd[1528]: 2025-03-25 01:34:09.569 [INFO][4201] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5slnx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0" Mar 25 01:34:09.616172 containerd[1528]: 2025-03-25 01:34:09.570 [INFO][4201] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5slnx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"52021031-372c-4e64-8acc-2655c18d3f55", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed", Pod:"coredns-7db6d8ff4d-5slnx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9edda3f4a45", MAC:"da:c3:f1:23:e8:f6", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:09.616172 containerd[1528]: 2025-03-25 01:34:09.605 [INFO][4201] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5slnx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--5slnx-eth0" Mar 25 01:34:09.658307 systemd-networkd[1432]: cali7581a7b5874: Link UP Mar 25 01:34:09.662861 systemd-networkd[1432]: cali7581a7b5874: Gained carrier Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.358 [INFO][4187] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0 calico-apiserver-6f45686d89- calico-apiserver c2d89f5e-3072-423c-9b85-c270e603fd27 736 0 2025-03-25 01:33:42 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f45686d89 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-y0b1r.gb1.brightbox.com calico-apiserver-6f45686d89-b47pq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7581a7b5874 [] []}} ContainerID="bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b47pq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.359 [INFO][4187] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b47pq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.459 [INFO][4240] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" HandleID="k8s-pod-network.bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.490 [INFO][4240] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" HandleID="k8s-pod-network.bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031f140), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-y0b1r.gb1.brightbox.com", "pod":"calico-apiserver-6f45686d89-b47pq", "timestamp":"2025-03-25 01:34:09.459586101 +0000 UTC"}, Hostname:"srv-y0b1r.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.491 [INFO][4240] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.540 [INFO][4240] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.540 [INFO][4240] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-y0b1r.gb1.brightbox.com' Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.548 [INFO][4240] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.557 [INFO][4240] ipam/ipam.go 372: Looking up existing affinities for host host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.576 [INFO][4240] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.582 [INFO][4240] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.590 [INFO][4240] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.591 [INFO][4240] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.606 [INFO][4240] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.618 [INFO][4240] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.634 [INFO][4240] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.67/26] block=192.168.91.64/26 handle="k8s-pod-network.bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.634 [INFO][4240] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.67/26] handle="k8s-pod-network.bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.636 [INFO][4240] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:34:09.722271 containerd[1528]: 2025-03-25 01:34:09.636 [INFO][4240] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.67/26] IPv6=[] ContainerID="bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" HandleID="k8s-pod-network.bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0" Mar 25 01:34:09.723841 containerd[1528]: 2025-03-25 01:34:09.643 [INFO][4187] cni-plugin/k8s.go 386: Populated endpoint ContainerID="bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b47pq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0", GenerateName:"calico-apiserver-6f45686d89-", Namespace:"calico-apiserver", SelfLink:"", UID:"c2d89f5e-3072-423c-9b85-c270e603fd27", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f45686d89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-6f45686d89-b47pq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7581a7b5874", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:09.723841 containerd[1528]: 2025-03-25 01:34:09.643 [INFO][4187] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.67/32] ContainerID="bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b47pq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0" Mar 25 01:34:09.723841 containerd[1528]: 2025-03-25 01:34:09.643 [INFO][4187] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7581a7b5874 ContainerID="bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b47pq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0" Mar 25 01:34:09.723841 containerd[1528]: 2025-03-25 01:34:09.670 [INFO][4187] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b47pq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0" Mar 25 01:34:09.723841 containerd[1528]: 2025-03-25 01:34:09.680 [INFO][4187] cni-plugin/k8s.go 414: 
Added Mac, interface name, and active container ID to endpoint ContainerID="bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b47pq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0", GenerateName:"calico-apiserver-6f45686d89-", Namespace:"calico-apiserver", SelfLink:"", UID:"c2d89f5e-3072-423c-9b85-c270e603fd27", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f45686d89", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d", Pod:"calico-apiserver-6f45686d89-b47pq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7581a7b5874", MAC:"7a:d7:a3:09:a6:73", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:09.723841 containerd[1528]: 2025-03-25 01:34:09.711 [INFO][4187] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" Namespace="calico-apiserver" Pod="calico-apiserver-6f45686d89-b47pq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--apiserver--6f45686d89--b47pq-eth0" Mar 25 01:34:09.739479 containerd[1528]: time="2025-03-25T01:34:09.738986394Z" level=info msg="connecting to shim b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed" address="unix:///run/containerd/s/2a85192aaf57506faf6d1f23a985efe8331a5ad8443e1341130c3d53ab4996d7" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:34:09.790890 systemd-networkd[1432]: calib46545142c6: Link UP Mar 25 01:34:09.791190 systemd-networkd[1432]: calib46545142c6: Gained carrier Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.359 [INFO][4182] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0 coredns-7db6d8ff4d- kube-system 19db9a96-b50d-4635-8125-6ec125a3de16 734 0 2025-03-25 01:33:34 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-y0b1r.gb1.brightbox.com coredns-7db6d8ff4d-m5nkq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib46545142c6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" 
Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5nkq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.360 [INFO][4182] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5nkq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.491 [INFO][4245] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" HandleID="k8s-pod-network.22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" Workload="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.508 [INFO][4245] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" HandleID="k8s-pod-network.22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" Workload="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002657a0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-y0b1r.gb1.brightbox.com", "pod":"coredns-7db6d8ff4d-m5nkq", "timestamp":"2025-03-25 01:34:09.491460576 +0000 UTC"}, Hostname:"srv-y0b1r.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.508 [INFO][4245] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.636 [INFO][4245] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.637 [INFO][4245] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-y0b1r.gb1.brightbox.com' Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.641 [INFO][4245] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.659 [INFO][4245] ipam/ipam.go 372: Looking up existing affinities for host host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.694 [INFO][4245] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.708 [INFO][4245] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.716 [INFO][4245] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.716 [INFO][4245] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.724 [INFO][4245] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941 Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.737 [INFO][4245] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.755 [INFO][4245] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.68/26] block=192.168.91.64/26 handle="k8s-pod-network.22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.755 [INFO][4245] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.68/26] handle="k8s-pod-network.22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.759 [INFO][4245] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:34:09.832260 containerd[1528]: 2025-03-25 01:34:09.759 [INFO][4245] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.68/26] IPv6=[] ContainerID="22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" HandleID="k8s-pod-network.22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" Workload="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0" Mar 25 01:34:09.834786 containerd[1528]: 2025-03-25 01:34:09.769 [INFO][4182] cni-plugin/k8s.go 386: Populated endpoint ContainerID="22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5nkq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"19db9a96-b50d-4635-8125-6ec125a3de16", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7db6d8ff4d-m5nkq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib46545142c6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:09.834786 containerd[1528]: 2025-03-25 01:34:09.776 [INFO][4182] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.68/32] ContainerID="22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5nkq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0" Mar 25 01:34:09.834786 containerd[1528]: 2025-03-25 01:34:09.776 [INFO][4182] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib46545142c6 ContainerID="22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5nkq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0" Mar 25 01:34:09.834786 containerd[1528]: 2025-03-25 01:34:09.792 [INFO][4182] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5nkq" 
WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0" Mar 25 01:34:09.834786 containerd[1528]: 2025-03-25 01:34:09.794 [INFO][4182] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5nkq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"19db9a96-b50d-4635-8125-6ec125a3de16", ResourceVersion:"734", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941", Pod:"coredns-7db6d8ff4d-m5nkq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib46545142c6", MAC:"de:53:6d:c6:a3:d5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:09.834786 containerd[1528]: 2025-03-25 01:34:09.824 [INFO][4182] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5nkq" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--m5nkq-eth0" Mar 25 01:34:09.875723 containerd[1528]: time="2025-03-25T01:34:09.875064339Z" level=info msg="connecting to shim bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d" address="unix:///run/containerd/s/90869812a4f30eabd6ffd35fa8288b8527d8f87293b624b412a14c2bf040bcab" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:34:09.889657 systemd[1]: Started cri-containerd-b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed.scope - libcontainer container b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed. 
Mar 25 01:34:09.923582 containerd[1528]: time="2025-03-25T01:34:09.923016019Z" level=info msg="connecting to shim 22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941" address="unix:///run/containerd/s/4e15ebe4dfe7050c5815cfb6a84fa42444a21d4bf02491b686844d6bda9c4b7b" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:34:09.963640 systemd[1]: Started cri-containerd-bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d.scope - libcontainer container bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d. Mar 25 01:34:09.983631 systemd[1]: Started cri-containerd-22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941.scope - libcontainer container 22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941. Mar 25 01:34:10.068368 containerd[1528]: time="2025-03-25T01:34:10.068212623Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-5slnx,Uid:52021031-372c-4e64-8acc-2655c18d3f55,Namespace:kube-system,Attempt:0,} returns sandbox id \"b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed\"" Mar 25 01:34:10.080107 containerd[1528]: time="2025-03-25T01:34:10.078816295Z" level=info msg="CreateContainer within sandbox \"b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:34:10.087713 systemd-networkd[1432]: vxlan.calico: Gained IPv6LL Mar 25 01:34:10.119349 containerd[1528]: time="2025-03-25T01:34:10.119301531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-849f5dfb9c-6lwwx,Uid:6685e276-da04-413c-850f-cc7ff0494084,Namespace:calico-system,Attempt:0,}" Mar 25 01:34:10.124265 containerd[1528]: time="2025-03-25T01:34:10.124226023Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zwppf,Uid:8cfce3bd-710f-418f-a9b3-28030dc77aba,Namespace:calico-system,Attempt:0,}" Mar 25 01:34:10.167769 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1824002132.mount: Deactivated successfully. 
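
Each "connecting to shim" message above carries the shim endpoint as a unix:// URL. A minimal, stdlib-only sketch that extracts the socket path from one of the addresses quoted in the log (the address below is copied from the message for sandbox 22c02fbe…):

// Pulls the socket path out of a shim address string of the form seen above.
package main

import (
	"fmt"
	"log"
	"net/url"
)

func main() {
	addr := "unix:///run/containerd/s/4e15ebe4dfe7050c5815cfb6a84fa42444a21d4bf02491b686844d6bda9c4b7b"
	u, err := url.Parse(addr)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("scheme:", u.Scheme) // unix
	fmt.Println("socket:", u.Path)   // /run/containerd/s/4e15eb...
}
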
Mar 25 01:34:10.170770 containerd[1528]: time="2025-03-25T01:34:10.170516707Z" level=info msg="Container c6e23a4abb2968a09ed8990ecc87df937575ed3bcee70ae17a870f9c33a6f354: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:10.208116 containerd[1528]: time="2025-03-25T01:34:10.208040277Z" level=info msg="CreateContainer within sandbox \"b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c6e23a4abb2968a09ed8990ecc87df937575ed3bcee70ae17a870f9c33a6f354\"" Mar 25 01:34:10.211300 containerd[1528]: time="2025-03-25T01:34:10.211261534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-m5nkq,Uid:19db9a96-b50d-4635-8125-6ec125a3de16,Namespace:kube-system,Attempt:0,} returns sandbox id \"22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941\"" Mar 25 01:34:10.213078 containerd[1528]: time="2025-03-25T01:34:10.211285098Z" level=info msg="StartContainer for \"c6e23a4abb2968a09ed8990ecc87df937575ed3bcee70ae17a870f9c33a6f354\"" Mar 25 01:34:10.218962 containerd[1528]: time="2025-03-25T01:34:10.216409298Z" level=info msg="connecting to shim c6e23a4abb2968a09ed8990ecc87df937575ed3bcee70ae17a870f9c33a6f354" address="unix:///run/containerd/s/2a85192aaf57506faf6d1f23a985efe8331a5ad8443e1341130c3d53ab4996d7" protocol=ttrpc version=3 Mar 25 01:34:10.224744 containerd[1528]: time="2025-03-25T01:34:10.224699668Z" level=info msg="CreateContainer within sandbox \"22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:34:10.246458 containerd[1528]: time="2025-03-25T01:34:10.246309092Z" level=info msg="Container 027e1fd1f0336f50d8479db354768ebeaecedd3e9989f598ec5341d2676d9647: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:10.264802 containerd[1528]: time="2025-03-25T01:34:10.264666460Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f45686d89-b47pq,Uid:c2d89f5e-3072-423c-9b85-c270e603fd27,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d\"" Mar 25 01:34:10.271679 systemd[1]: Started cri-containerd-c6e23a4abb2968a09ed8990ecc87df937575ed3bcee70ae17a870f9c33a6f354.scope - libcontainer container c6e23a4abb2968a09ed8990ecc87df937575ed3bcee70ae17a870f9c33a6f354. Mar 25 01:34:10.283595 containerd[1528]: time="2025-03-25T01:34:10.282489682Z" level=info msg="CreateContainer within sandbox \"22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"027e1fd1f0336f50d8479db354768ebeaecedd3e9989f598ec5341d2676d9647\"" Mar 25 01:34:10.285157 containerd[1528]: time="2025-03-25T01:34:10.285123922Z" level=info msg="StartContainer for \"027e1fd1f0336f50d8479db354768ebeaecedd3e9989f598ec5341d2676d9647\"" Mar 25 01:34:10.292404 containerd[1528]: time="2025-03-25T01:34:10.292265576Z" level=info msg="connecting to shim 027e1fd1f0336f50d8479db354768ebeaecedd3e9989f598ec5341d2676d9647" address="unix:///run/containerd/s/4e15ebe4dfe7050c5815cfb6a84fa42444a21d4bf02491b686844d6bda9c4b7b" protocol=ttrpc version=3 Mar 25 01:34:10.361863 systemd[1]: Started cri-containerd-027e1fd1f0336f50d8479db354768ebeaecedd3e9989f598ec5341d2676d9647.scope - libcontainer container 027e1fd1f0336f50d8479db354768ebeaecedd3e9989f598ec5341d2676d9647. 
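
Note that each container task connects to the same shim socket as the pod sandbox it runs in: c6e23a4ab… reuses /run/containerd/s/2a85192aa… from sandbox b2f606307…, and 027e1fd1f… reuses /run/containerd/s/4e15ebe4d… from sandbox 22c02fbeb…. A small scan over excerpts of these messages groups the IDs by socket path to make that visible; the regexp only reflects the format of the lines quoted here:

// Groups "connecting to shim" excerpts from the log above by socket path,
// showing that container tasks share their pod sandbox's shim socket.
package main

import (
	"fmt"
	"regexp"
)

func main() {
	lines := []string{
		`connecting to shim b2f60630749aa69ed5d0c70f0d0df02ad7fea783f2d2201bb8f03757465efaed address="unix:///run/containerd/s/2a85192aaf57506faf6d1f23a985efe8331a5ad8443e1341130c3d53ab4996d7"`,
		`connecting to shim c6e23a4abb2968a09ed8990ecc87df937575ed3bcee70ae17a870f9c33a6f354 address="unix:///run/containerd/s/2a85192aaf57506faf6d1f23a985efe8331a5ad8443e1341130c3d53ab4996d7"`,
		`connecting to shim 22c02fbeb05c0713fd0cf8696d870c99468b93a8b59afe501900f00e284c3941 address="unix:///run/containerd/s/4e15ebe4dfe7050c5815cfb6a84fa42444a21d4bf02491b686844d6bda9c4b7b"`,
		`connecting to shim 027e1fd1f0336f50d8479db354768ebeaecedd3e9989f598ec5341d2676d9647 address="unix:///run/containerd/s/4e15ebe4dfe7050c5815cfb6a84fa42444a21d4bf02491b686844d6bda9c4b7b"`,
	}
	re := regexp.MustCompile(`connecting to shim ([0-9a-f]+) address="unix://([^"]+)"`)
	bySocket := map[string][]string{}
	for _, l := range lines {
		if m := re.FindStringSubmatch(l); m != nil {
			bySocket[m[2]] = append(bySocket[m[2]], m[1][:12]) // shorten IDs for display
		}
	}
	for sock, ids := range bySocket {
		fmt.Println(sock, "->", ids)
	}
}
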
Mar 25 01:34:10.461982 containerd[1528]: time="2025-03-25T01:34:10.461929840Z" level=info msg="StartContainer for \"c6e23a4abb2968a09ed8990ecc87df937575ed3bcee70ae17a870f9c33a6f354\" returns successfully" Mar 25 01:34:10.479274 containerd[1528]: time="2025-03-25T01:34:10.477695560Z" level=info msg="StartContainer for \"027e1fd1f0336f50d8479db354768ebeaecedd3e9989f598ec5341d2676d9647\" returns successfully" Mar 25 01:34:10.623292 kubelet[2918]: I0325 01:34:10.623096 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-m5nkq" podStartSLOduration=36.623038853 podStartE2EDuration="36.623038853s" podCreationTimestamp="2025-03-25 01:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:34:10.622837903 +0000 UTC m=+50.775448084" watchObservedRunningTime="2025-03-25 01:34:10.623038853 +0000 UTC m=+50.775649039" Mar 25 01:34:10.625845 kubelet[2918]: I0325 01:34:10.624308 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-5slnx" podStartSLOduration=36.623263643 podStartE2EDuration="36.623263643s" podCreationTimestamp="2025-03-25 01:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:34:10.576931611 +0000 UTC m=+50.729541770" watchObservedRunningTime="2025-03-25 01:34:10.623263643 +0000 UTC m=+50.775873806" Mar 25 01:34:10.835953 systemd-networkd[1432]: calib44b6509e82: Link UP Mar 25 01:34:10.840007 systemd-networkd[1432]: calib44b6509e82: Gained carrier Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.368 [INFO][4422] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0 calico-kube-controllers-849f5dfb9c- calico-system 6685e276-da04-413c-850f-cc7ff0494084 735 0 2025-03-25 01:33:42 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:849f5dfb9c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-y0b1r.gb1.brightbox.com calico-kube-controllers-849f5dfb9c-6lwwx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calib44b6509e82 [] []}} ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Namespace="calico-system" Pod="calico-kube-controllers-849f5dfb9c-6lwwx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.368 [INFO][4422] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Namespace="calico-system" Pod="calico-kube-controllers-849f5dfb9c-6lwwx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.552 [INFO][4503] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" 
Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.600 [INFO][4503] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051a40), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-y0b1r.gb1.brightbox.com", "pod":"calico-kube-controllers-849f5dfb9c-6lwwx", "timestamp":"2025-03-25 01:34:10.552894412 +0000 UTC"}, Hostname:"srv-y0b1r.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.601 [INFO][4503] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.601 [INFO][4503] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.601 [INFO][4503] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-y0b1r.gb1.brightbox.com' Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.619 [INFO][4503] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.652 [INFO][4503] ipam/ipam.go 372: Looking up existing affinities for host host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.668 [INFO][4503] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.674 [INFO][4503] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.687 [INFO][4503] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.687 [INFO][4503] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.694 [INFO][4503] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.761 [INFO][4503] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.809 [INFO][4503] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.69/26] block=192.168.91.64/26 handle="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.809 
[INFO][4503] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.69/26] handle="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.810 [INFO][4503] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:34:10.903641 containerd[1528]: 2025-03-25 01:34:10.810 [INFO][4503] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.69/26] IPv6=[] ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" Mar 25 01:34:10.906326 containerd[1528]: 2025-03-25 01:34:10.816 [INFO][4422] cni-plugin/k8s.go 386: Populated endpoint ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Namespace="calico-system" Pod="calico-kube-controllers-849f5dfb9c-6lwwx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0", GenerateName:"calico-kube-controllers-849f5dfb9c-", Namespace:"calico-system", SelfLink:"", UID:"6685e276-da04-413c-850f-cc7ff0494084", ResourceVersion:"735", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"849f5dfb9c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-849f5dfb9c-6lwwx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib44b6509e82", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:10.906326 containerd[1528]: 2025-03-25 01:34:10.816 [INFO][4422] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.69/32] ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Namespace="calico-system" Pod="calico-kube-controllers-849f5dfb9c-6lwwx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" Mar 25 01:34:10.906326 containerd[1528]: 2025-03-25 01:34:10.816 [INFO][4422] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib44b6509e82 ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Namespace="calico-system" Pod="calico-kube-controllers-849f5dfb9c-6lwwx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" Mar 25 01:34:10.906326 containerd[1528]: 2025-03-25 01:34:10.840 
[INFO][4422] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Namespace="calico-system" Pod="calico-kube-controllers-849f5dfb9c-6lwwx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" Mar 25 01:34:10.906326 containerd[1528]: 2025-03-25 01:34:10.843 [INFO][4422] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Namespace="calico-system" Pod="calico-kube-controllers-849f5dfb9c-6lwwx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0", GenerateName:"calico-kube-controllers-849f5dfb9c-", Namespace:"calico-system", SelfLink:"", UID:"6685e276-da04-413c-850f-cc7ff0494084", ResourceVersion:"735", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"849f5dfb9c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe", Pod:"calico-kube-controllers-849f5dfb9c-6lwwx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib44b6509e82", MAC:"fe:b8:07:46:32:18", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:10.906326 containerd[1528]: 2025-03-25 01:34:10.897 [INFO][4422] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Namespace="calico-system" Pod="calico-kube-controllers-849f5dfb9c-6lwwx" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" Mar 25 01:34:10.995493 containerd[1528]: time="2025-03-25T01:34:10.995298954Z" level=info msg="connecting to shim 249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" address="unix:///run/containerd/s/793aea622db913bf7cd9d4753bddb8967c45c10e92674c5fe07298c638f21646" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:34:11.053693 systemd[1]: Started cri-containerd-249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe.scope - libcontainer container 249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe. 
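
The kubelet pod_startup_latency_tracker entries at 01:34:10.62 above report podStartSLOduration=36.623038853 for coredns-7db6d8ff4d-m5nkq. A quick stdlib-only check that this figure is simply the gap between the podCreationTimestamp and the observed running time printed in the same entry (image pulling contributes nothing here, both pull timestamps are zero); the monotonic "m=+…" suffix is dropped before parsing:

// Recomputes the 36.623038853s startup duration from the two timestamps
// printed in the kubelet entry above.
package main

import (
	"fmt"
	"log"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-03-25 01:33:34 +0000 UTC")
	if err != nil {
		log.Fatal(err)
	}
	running, err := time.Parse(layout, "2025-03-25 01:34:10.623038853 +0000 UTC")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(running.Sub(created)) // 36.623038853s
}
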
Mar 25 01:34:11.073633 systemd-networkd[1432]: calief56e2042ae: Link UP Mar 25 01:34:11.085354 systemd-networkd[1432]: calief56e2042ae: Gained carrier Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.360 [INFO][4437] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0 csi-node-driver- calico-system 8cfce3bd-710f-418f-a9b3-28030dc77aba 609 0 2025-03-25 01:33:42 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-y0b1r.gb1.brightbox.com csi-node-driver-zwppf eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calief56e2042ae [] []}} ContainerID="b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" Namespace="calico-system" Pod="csi-node-driver-zwppf" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.362 [INFO][4437] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" Namespace="calico-system" Pod="csi-node-driver-zwppf" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.552 [INFO][4510] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" HandleID="k8s-pod-network.b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" Workload="srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.604 [INFO][4510] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" HandleID="k8s-pod-network.b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" Workload="srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000292930), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-y0b1r.gb1.brightbox.com", "pod":"csi-node-driver-zwppf", "timestamp":"2025-03-25 01:34:10.552387315 +0000 UTC"}, Hostname:"srv-y0b1r.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.604 [INFO][4510] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.810 [INFO][4510] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.810 [INFO][4510] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-y0b1r.gb1.brightbox.com' Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.822 [INFO][4510] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.869 [INFO][4510] ipam/ipam.go 372: Looking up existing affinities for host host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.899 [INFO][4510] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.906 [INFO][4510] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.917 [INFO][4510] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.918 [INFO][4510] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.923 [INFO][4510] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0 Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:10.948 [INFO][4510] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:11.034 [INFO][4510] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.70/26] block=192.168.91.64/26 handle="k8s-pod-network.b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:11.034 [INFO][4510] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.70/26] handle="k8s-pod-network.b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:11.038 [INFO][4510] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
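
Every HandleID in the IPAM messages above is the literal prefix "k8s-pod-network." followed by the CNI ContainerID. A one-liner to recover the container ID from such a handle; the prefix is read off the log text, not taken from Calico source:

// Strips the "k8s-pod-network." prefix seen on every HandleID above.
package main

import (
	"fmt"
	"strings"
)

func main() {
	handle := "k8s-pod-network.b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0"
	fmt.Println(strings.TrimPrefix(handle, "k8s-pod-network."))
}
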
Mar 25 01:34:11.201264 containerd[1528]: 2025-03-25 01:34:11.038 [INFO][4510] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.70/26] IPv6=[] ContainerID="b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" HandleID="k8s-pod-network.b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" Workload="srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0" Mar 25 01:34:11.202978 containerd[1528]: 2025-03-25 01:34:11.043 [INFO][4437] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" Namespace="calico-system" Pod="csi-node-driver-zwppf" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8cfce3bd-710f-418f-a9b3-28030dc77aba", ResourceVersion:"609", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-zwppf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.91.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calief56e2042ae", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:11.202978 containerd[1528]: 2025-03-25 01:34:11.045 [INFO][4437] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.70/32] ContainerID="b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" Namespace="calico-system" Pod="csi-node-driver-zwppf" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0" Mar 25 01:34:11.202978 containerd[1528]: 2025-03-25 01:34:11.045 [INFO][4437] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calief56e2042ae ContainerID="b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" Namespace="calico-system" Pod="csi-node-driver-zwppf" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0" Mar 25 01:34:11.202978 containerd[1528]: 2025-03-25 01:34:11.087 [INFO][4437] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" Namespace="calico-system" Pod="csi-node-driver-zwppf" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0" Mar 25 01:34:11.202978 containerd[1528]: 2025-03-25 01:34:11.090 [INFO][4437] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" Namespace="calico-system" Pod="csi-node-driver-zwppf" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8cfce3bd-710f-418f-a9b3-28030dc77aba", ResourceVersion:"609", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 33, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0", Pod:"csi-node-driver-zwppf", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.91.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calief56e2042ae", MAC:"7e:b5:ad:5e:73:7a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:11.202978 containerd[1528]: 2025-03-25 01:34:11.197 [INFO][4437] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" Namespace="calico-system" Pod="csi-node-driver-zwppf" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-csi--node--driver--zwppf-eth0" Mar 25 01:34:11.314317 containerd[1528]: time="2025-03-25T01:34:11.314257918Z" level=info msg="connecting to shim b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0" address="unix:///run/containerd/s/8ccc0b83cef0841e4d69b18d3dc87596ec5864825be57a0237c7d10a5136d8ad" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:34:11.401662 systemd[1]: Started cri-containerd-b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0.scope - libcontainer container b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0. 
Mar 25 01:34:11.430593 systemd-networkd[1432]: cali7581a7b5874: Gained IPv6LL Mar 25 01:34:11.495181 systemd-networkd[1432]: cali9edda3f4a45: Gained IPv6LL Mar 25 01:34:11.498757 containerd[1528]: time="2025-03-25T01:34:11.498600836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-849f5dfb9c-6lwwx,Uid:6685e276-da04-413c-850f-cc7ff0494084,Namespace:calico-system,Attempt:0,} returns sandbox id \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\"" Mar 25 01:34:11.535961 containerd[1528]: time="2025-03-25T01:34:11.535879255Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zwppf,Uid:8cfce3bd-710f-418f-a9b3-28030dc77aba,Namespace:calico-system,Attempt:0,} returns sandbox id \"b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0\"" Mar 25 01:34:11.687542 systemd-networkd[1432]: calib46545142c6: Gained IPv6LL Mar 25 01:34:12.519651 systemd-networkd[1432]: calief56e2042ae: Gained IPv6LL Mar 25 01:34:12.624579 containerd[1528]: time="2025-03-25T01:34:12.623722802Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:12.625081 containerd[1528]: time="2025-03-25T01:34:12.624687783Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204" Mar 25 01:34:12.625488 containerd[1528]: time="2025-03-25T01:34:12.625427358Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:12.628721 containerd[1528]: time="2025-03-25T01:34:12.628677006Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:12.629364 containerd[1528]: time="2025-03-25T01:34:12.629251303Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 4.89240243s" Mar 25 01:34:12.629364 containerd[1528]: time="2025-03-25T01:34:12.629291710Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 25 01:34:12.635035 containerd[1528]: time="2025-03-25T01:34:12.634314933Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:34:12.638610 containerd[1528]: time="2025-03-25T01:34:12.638518434Z" level=info msg="CreateContainer within sandbox \"09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:34:12.647548 containerd[1528]: time="2025-03-25T01:34:12.647020241Z" level=info msg="Container 24e7581bcbd965cd1d401648cde4aaef8b65b56070ac49b6a3013da763dd71fd: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:12.657240 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2966688670.mount: Deactivated successfully. 
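
Rough back-of-envelope from the pull messages above: the first calico/apiserver v3.29.2 pull read 42993204 bytes and containerd reports it took 4.89240243s. The sketch below only relates those two numbers as printed in the log; it is not a containerd metric.

// Effective pull rate from the bytes-read and duration figures quoted above.
package main

import (
	"fmt"
	"log"
	"time"
)

func main() {
	const bytesRead = 42993204
	d, err := time.ParseDuration("4.89240243s")
	if err != nil {
		log.Fatal(err)
	}
	mbps := float64(bytesRead) / d.Seconds() / (1 << 20)
	fmt.Printf("~%.1f MiB/s effective pull rate\n", mbps) // ~8.4 MiB/s
}
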
Mar 25 01:34:12.661398 containerd[1528]: time="2025-03-25T01:34:12.661212940Z" level=info msg="CreateContainer within sandbox \"09a4500ebd383f68bade95216a1e8e3021e3f762994f8e27c7d7a63e676d433c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"24e7581bcbd965cd1d401648cde4aaef8b65b56070ac49b6a3013da763dd71fd\"" Mar 25 01:34:12.661854 containerd[1528]: time="2025-03-25T01:34:12.661777568Z" level=info msg="StartContainer for \"24e7581bcbd965cd1d401648cde4aaef8b65b56070ac49b6a3013da763dd71fd\"" Mar 25 01:34:12.664255 containerd[1528]: time="2025-03-25T01:34:12.664223374Z" level=info msg="connecting to shim 24e7581bcbd965cd1d401648cde4aaef8b65b56070ac49b6a3013da763dd71fd" address="unix:///run/containerd/s/c192be8051dcd08708305150cabed649c6480a75c85ee5457ba98eba21ae4353" protocol=ttrpc version=3 Mar 25 01:34:12.704682 systemd[1]: Started cri-containerd-24e7581bcbd965cd1d401648cde4aaef8b65b56070ac49b6a3013da763dd71fd.scope - libcontainer container 24e7581bcbd965cd1d401648cde4aaef8b65b56070ac49b6a3013da763dd71fd. Mar 25 01:34:12.710778 systemd-networkd[1432]: calib44b6509e82: Gained IPv6LL Mar 25 01:34:12.788924 containerd[1528]: time="2025-03-25T01:34:12.788778139Z" level=info msg="StartContainer for \"24e7581bcbd965cd1d401648cde4aaef8b65b56070ac49b6a3013da763dd71fd\" returns successfully" Mar 25 01:34:13.001630 containerd[1528]: time="2025-03-25T01:34:13.001563113Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:13.003538 containerd[1528]: time="2025-03-25T01:34:13.003161785Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 25 01:34:13.008903 containerd[1528]: time="2025-03-25T01:34:13.008139172Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 373.782009ms" Mar 25 01:34:13.008903 containerd[1528]: time="2025-03-25T01:34:13.008199756Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 25 01:34:13.011629 containerd[1528]: time="2025-03-25T01:34:13.010553591Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 25 01:34:13.016734 containerd[1528]: time="2025-03-25T01:34:13.016541007Z" level=info msg="CreateContainer within sandbox \"bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:34:13.030173 containerd[1528]: time="2025-03-25T01:34:13.029543282Z" level=info msg="Container 318ed9c9d6b3b4ba869367a99741c01250371497ee0ad36239a1b956b2deb541: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:13.072368 containerd[1528]: time="2025-03-25T01:34:13.072173886Z" level=info msg="CreateContainer within sandbox \"bb892026aa1442e6b4c2ce77ec94e3bbc0c705cdf4e1e862859c7ed15373017d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"318ed9c9d6b3b4ba869367a99741c01250371497ee0ad36239a1b956b2deb541\"" Mar 25 01:34:13.075029 containerd[1528]: time="2025-03-25T01:34:13.074768167Z" level=info msg="StartContainer for 
\"318ed9c9d6b3b4ba869367a99741c01250371497ee0ad36239a1b956b2deb541\"" Mar 25 01:34:13.076721 containerd[1528]: time="2025-03-25T01:34:13.076688479Z" level=info msg="connecting to shim 318ed9c9d6b3b4ba869367a99741c01250371497ee0ad36239a1b956b2deb541" address="unix:///run/containerd/s/90869812a4f30eabd6ffd35fa8288b8527d8f87293b624b412a14c2bf040bcab" protocol=ttrpc version=3 Mar 25 01:34:13.119082 systemd[1]: Started cri-containerd-318ed9c9d6b3b4ba869367a99741c01250371497ee0ad36239a1b956b2deb541.scope - libcontainer container 318ed9c9d6b3b4ba869367a99741c01250371497ee0ad36239a1b956b2deb541. Mar 25 01:34:13.228545 containerd[1528]: time="2025-03-25T01:34:13.228484017Z" level=info msg="StartContainer for \"318ed9c9d6b3b4ba869367a99741c01250371497ee0ad36239a1b956b2deb541\" returns successfully" Mar 25 01:34:13.585102 kubelet[2918]: I0325 01:34:13.584968 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6f45686d89-b5srm" podStartSLOduration=26.687465728 podStartE2EDuration="31.584943861s" podCreationTimestamp="2025-03-25 01:33:42 +0000 UTC" firstStartedPulling="2025-03-25 01:34:07.736349138 +0000 UTC m=+47.888959286" lastFinishedPulling="2025-03-25 01:34:12.633827252 +0000 UTC m=+52.786437419" observedRunningTime="2025-03-25 01:34:13.583199286 +0000 UTC m=+53.735809445" watchObservedRunningTime="2025-03-25 01:34:13.584943861 +0000 UTC m=+53.737554015" Mar 25 01:34:14.669561 kubelet[2918]: I0325 01:34:14.669459 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:34:15.174187 kubelet[2918]: I0325 01:34:15.174065 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6f45686d89-b47pq" podStartSLOduration=30.433919812 podStartE2EDuration="33.17242828s" podCreationTimestamp="2025-03-25 01:33:42 +0000 UTC" firstStartedPulling="2025-03-25 01:34:10.271510048 +0000 UTC m=+50.424120196" lastFinishedPulling="2025-03-25 01:34:13.010018499 +0000 UTC m=+53.162628664" observedRunningTime="2025-03-25 01:34:13.60080651 +0000 UTC m=+53.753416675" watchObservedRunningTime="2025-03-25 01:34:15.17242828 +0000 UTC m=+55.325038449" Mar 25 01:34:16.549152 containerd[1528]: time="2025-03-25T01:34:16.549040793Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:16.552570 containerd[1528]: time="2025-03-25T01:34:16.552493357Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912" Mar 25 01:34:16.554312 containerd[1528]: time="2025-03-25T01:34:16.553654014Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:16.557180 containerd[1528]: time="2025-03-25T01:34:16.557149699Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:16.559390 containerd[1528]: time="2025-03-25T01:34:16.559350734Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest 
\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 3.548736356s" Mar 25 01:34:16.559557 containerd[1528]: time="2025-03-25T01:34:16.559528792Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\"" Mar 25 01:34:16.563411 containerd[1528]: time="2025-03-25T01:34:16.562687669Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 25 01:34:16.598369 containerd[1528]: time="2025-03-25T01:34:16.598322770Z" level=info msg="CreateContainer within sandbox \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 25 01:34:16.622764 containerd[1528]: time="2025-03-25T01:34:16.622709711Z" level=info msg="Container 146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:16.632336 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount522404679.mount: Deactivated successfully. Mar 25 01:34:16.642551 containerd[1528]: time="2025-03-25T01:34:16.642310163Z" level=info msg="CreateContainer within sandbox \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\"" Mar 25 01:34:16.645082 containerd[1528]: time="2025-03-25T01:34:16.643242826Z" level=info msg="StartContainer for \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\"" Mar 25 01:34:16.645613 containerd[1528]: time="2025-03-25T01:34:16.645571543Z" level=info msg="connecting to shim 146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea" address="unix:///run/containerd/s/793aea622db913bf7cd9d4753bddb8967c45c10e92674c5fe07298c638f21646" protocol=ttrpc version=3 Mar 25 01:34:16.702894 systemd[1]: Started cri-containerd-146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea.scope - libcontainer container 146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea. 
Mar 25 01:34:16.821103 containerd[1528]: time="2025-03-25T01:34:16.820925278Z" level=info msg="StartContainer for \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" returns successfully" Mar 25 01:34:16.970121 containerd[1528]: time="2025-03-25T01:34:16.968772828Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" id:\"e06539a28d693d404422a6d42802318ead98b538209889e6c8014a3eee669518\" pid:4788 exited_at:{seconds:1742866456 nanos:967624975}" Mar 25 01:34:17.638476 kubelet[2918]: I0325 01:34:17.638327 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-849f5dfb9c-6lwwx" podStartSLOduration=30.577600605 podStartE2EDuration="35.638289742s" podCreationTimestamp="2025-03-25 01:33:42 +0000 UTC" firstStartedPulling="2025-03-25 01:34:11.501427282 +0000 UTC m=+51.654037438" lastFinishedPulling="2025-03-25 01:34:16.56211642 +0000 UTC m=+56.714726575" observedRunningTime="2025-03-25 01:34:17.637898909 +0000 UTC m=+57.790509085" watchObservedRunningTime="2025-03-25 01:34:17.638289742 +0000 UTC m=+57.790899906" Mar 25 01:34:17.698347 containerd[1528]: time="2025-03-25T01:34:17.698275812Z" level=info msg="TaskExit event in podsandbox handler container_id:\"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" id:\"077e7848ab29e74424fc4fb7e54545062b53520fc4d0ecf099922e3400f98587\" pid:4824 exited_at:{seconds:1742866457 nanos:697969450}" Mar 25 01:34:18.430015 containerd[1528]: time="2025-03-25T01:34:18.429539826Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:18.433576 containerd[1528]: time="2025-03-25T01:34:18.433512768Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887" Mar 25 01:34:18.435088 containerd[1528]: time="2025-03-25T01:34:18.435032112Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:18.443304 containerd[1528]: time="2025-03-25T01:34:18.442125949Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:18.453403 containerd[1528]: time="2025-03-25T01:34:18.453360016Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 1.885794649s" Mar 25 01:34:18.453791 containerd[1528]: time="2025-03-25T01:34:18.453759420Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\"" Mar 25 01:34:18.471993 containerd[1528]: time="2025-03-25T01:34:18.471876799Z" level=info msg="CreateContainer within sandbox \"b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 25 01:34:18.508203 containerd[1528]: time="2025-03-25T01:34:18.506632007Z" level=info msg="Container 
514b0aae56f4c236b1bc1d81713af6373b5ca6bcc2da55005a39829e7e1c7a2c: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:18.520303 containerd[1528]: time="2025-03-25T01:34:18.520023825Z" level=info msg="CreateContainer within sandbox \"b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"514b0aae56f4c236b1bc1d81713af6373b5ca6bcc2da55005a39829e7e1c7a2c\"" Mar 25 01:34:18.522660 containerd[1528]: time="2025-03-25T01:34:18.522419419Z" level=info msg="StartContainer for \"514b0aae56f4c236b1bc1d81713af6373b5ca6bcc2da55005a39829e7e1c7a2c\"" Mar 25 01:34:18.525749 containerd[1528]: time="2025-03-25T01:34:18.525363383Z" level=info msg="connecting to shim 514b0aae56f4c236b1bc1d81713af6373b5ca6bcc2da55005a39829e7e1c7a2c" address="unix:///run/containerd/s/8ccc0b83cef0841e4d69b18d3dc87596ec5864825be57a0237c7d10a5136d8ad" protocol=ttrpc version=3 Mar 25 01:34:18.543157 containerd[1528]: time="2025-03-25T01:34:18.542847947Z" level=info msg="StopContainer for \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\" with timeout 300 (s)" Mar 25 01:34:18.562825 containerd[1528]: time="2025-03-25T01:34:18.562581991Z" level=info msg="Stop container \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\" with signal terminated" Mar 25 01:34:18.646820 systemd[1]: Started cri-containerd-514b0aae56f4c236b1bc1d81713af6373b5ca6bcc2da55005a39829e7e1c7a2c.scope - libcontainer container 514b0aae56f4c236b1bc1d81713af6373b5ca6bcc2da55005a39829e7e1c7a2c. Mar 25 01:34:18.874156 containerd[1528]: time="2025-03-25T01:34:18.872792383Z" level=info msg="StartContainer for \"514b0aae56f4c236b1bc1d81713af6373b5ca6bcc2da55005a39829e7e1c7a2c\" returns successfully" Mar 25 01:34:18.876757 containerd[1528]: time="2025-03-25T01:34:18.876245530Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 25 01:34:18.961800 containerd[1528]: time="2025-03-25T01:34:18.961583572Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" id:\"0b4195d24d0638dc58db6e2d84df272ec6a08b5f69647ada7bf589224a907e83\" pid:4872 exited_at:{seconds:1742866458 nanos:959755544}" Mar 25 01:34:18.966960 containerd[1528]: time="2025-03-25T01:34:18.966865559Z" level=info msg="StopContainer for \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" with timeout 5 (s)" Mar 25 01:34:18.968399 containerd[1528]: time="2025-03-25T01:34:18.968260877Z" level=info msg="Stop container \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" with signal terminated" Mar 25 01:34:19.029455 systemd[1]: cri-containerd-bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b.scope: Deactivated successfully. Mar 25 01:34:19.030210 systemd[1]: cri-containerd-bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b.scope: Consumed 2.322s CPU time, 181M memory peak, 36.7M read from disk, 644K written to disk. 
Mar 25 01:34:19.040518 containerd[1528]: time="2025-03-25T01:34:19.037789028Z" level=info msg="received exit event container_id:\"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" id:\"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" pid:3819 exited_at:{seconds:1742866459 nanos:37080211}" Mar 25 01:34:19.040518 containerd[1528]: time="2025-03-25T01:34:19.037821947Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" id:\"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" pid:3819 exited_at:{seconds:1742866459 nanos:37080211}" Mar 25 01:34:19.099763 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b-rootfs.mount: Deactivated successfully. Mar 25 01:34:19.153296 containerd[1528]: time="2025-03-25T01:34:19.152260596Z" level=info msg="StopContainer for \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" returns successfully" Mar 25 01:34:19.175987 containerd[1528]: time="2025-03-25T01:34:19.175828062Z" level=info msg="StopPodSandbox for \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\"" Mar 25 01:34:19.187000 containerd[1528]: time="2025-03-25T01:34:19.186905307Z" level=info msg="Container to stop \"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 25 01:34:19.187000 containerd[1528]: time="2025-03-25T01:34:19.186980109Z" level=info msg="Container to stop \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 25 01:34:19.187000 containerd[1528]: time="2025-03-25T01:34:19.187001476Z" level=info msg="Container to stop \"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 25 01:34:19.241937 systemd[1]: cri-containerd-ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f.scope: Deactivated successfully. Mar 25 01:34:19.244192 containerd[1528]: time="2025-03-25T01:34:19.242756047Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" id:\"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" pid:3444 exit_status:137 exited_at:{seconds:1742866459 nanos:242211779}" Mar 25 01:34:19.303312 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f-rootfs.mount: Deactivated successfully. 
Mar 25 01:34:19.307237 containerd[1528]: time="2025-03-25T01:34:19.306911996Z" level=info msg="shim disconnected" id=ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f namespace=k8s.io Mar 25 01:34:19.307237 containerd[1528]: time="2025-03-25T01:34:19.306953736Z" level=warning msg="cleaning up after shim disconnected" id=ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f namespace=k8s.io Mar 25 01:34:19.326534 containerd[1528]: time="2025-03-25T01:34:19.306969946Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 25 01:34:19.475466 containerd[1528]: time="2025-03-25T01:34:19.473707341Z" level=info msg="received exit event sandbox_id:\"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" exit_status:137 exited_at:{seconds:1742866459 nanos:242211779}" Mar 25 01:34:19.476322 containerd[1528]: time="2025-03-25T01:34:19.475870296Z" level=info msg="TearDown network for sandbox \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" successfully" Mar 25 01:34:19.476322 containerd[1528]: time="2025-03-25T01:34:19.475903097Z" level=info msg="StopPodSandbox for \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" returns successfully" Mar 25 01:34:19.511232 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f-shm.mount: Deactivated successfully. Mar 25 01:34:19.589482 kubelet[2918]: I0325 01:34:19.589380 2918 topology_manager.go:215] "Topology Admit Handler" podUID="4fc43466-026d-4115-aa03-cae9bde3cb59" podNamespace="calico-system" podName="calico-node-m2l8g" Mar 25 01:34:19.594268 kubelet[2918]: E0325 01:34:19.594232 2918 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" containerName="install-cni" Mar 25 01:34:19.595174 kubelet[2918]: E0325 01:34:19.594275 2918 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" containerName="flexvol-driver" Mar 25 01:34:19.595174 kubelet[2918]: E0325 01:34:19.594289 2918 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" containerName="calico-node" Mar 25 01:34:19.595174 kubelet[2918]: I0325 01:34:19.594956 2918 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" containerName="calico-node" Mar 25 01:34:19.638177 systemd[1]: Created slice kubepods-besteffort-pod4fc43466_026d_4115_aa03_cae9bde3cb59.slice - libcontainer container kubepods-besteffort-pod4fc43466_026d_4115_aa03_cae9bde3cb59.slice. 
Mar 25 01:34:19.671225 kubelet[2918]: I0325 01:34:19.670524 2918 scope.go:117] "RemoveContainer" containerID="bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b" Mar 25 01:34:19.675546 kubelet[2918]: I0325 01:34:19.674919 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-log-dir\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " Mar 25 01:34:19.675546 kubelet[2918]: I0325 01:34:19.675064 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-var-run-calico\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " Mar 25 01:34:19.682170 kubelet[2918]: I0325 01:34:19.679818 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 25 01:34:19.683362 kubelet[2918]: I0325 01:34:19.679818 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 25 01:34:19.683362 kubelet[2918]: I0325 01:34:19.681567 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). InnerVolumeSpecName "var-lib-calico". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 25 01:34:19.683362 kubelet[2918]: I0325 01:34:19.681527 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-var-lib-calico\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " Mar 25 01:34:19.683362 kubelet[2918]: I0325 01:34:19.682413 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-xtables-lock\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " Mar 25 01:34:19.683362 kubelet[2918]: I0325 01:34:19.682459 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-policysync\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " Mar 25 01:34:19.683362 kubelet[2918]: I0325 01:34:19.682485 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-bin-dir\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " Mar 25 01:34:19.683736 kubelet[2918]: I0325 01:34:19.682518 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-tigera-ca-bundle\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " Mar 25 01:34:19.683736 kubelet[2918]: I0325 01:34:19.682547 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmxbk\" (UniqueName: \"kubernetes.io/projected/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-kube-api-access-mmxbk\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " Mar 25 01:34:19.683736 kubelet[2918]: I0325 01:34:19.682585 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-node-certs\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " Mar 25 01:34:19.683736 kubelet[2918]: I0325 01:34:19.682617 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-flexvol-driver-host\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " Mar 25 01:34:19.683736 kubelet[2918]: I0325 01:34:19.682680 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-lib-modules\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " Mar 25 01:34:19.683736 kubelet[2918]: I0325 01:34:19.682719 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-net-dir\") pod \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\" (UID: \"d3b4fc6b-6a13-424a-a46e-4552d4b72a7a\") " 
Mar 25 01:34:19.684907 kubelet[2918]: I0325 01:34:19.684259 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/4fc43466-026d-4115-aa03-cae9bde3cb59-policysync\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.684907 kubelet[2918]: I0325 01:34:19.684320 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4fc43466-026d-4115-aa03-cae9bde3cb59-xtables-lock\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.684907 kubelet[2918]: I0325 01:34:19.684348 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/4fc43466-026d-4115-aa03-cae9bde3cb59-cni-net-dir\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.684907 kubelet[2918]: I0325 01:34:19.684386 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/4fc43466-026d-4115-aa03-cae9bde3cb59-cni-log-dir\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.684907 kubelet[2918]: I0325 01:34:19.684415 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/4fc43466-026d-4115-aa03-cae9bde3cb59-flexvol-driver-host\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.686594 kubelet[2918]: I0325 01:34:19.684440 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fc43466-026d-4115-aa03-cae9bde3cb59-tigera-ca-bundle\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.686594 kubelet[2918]: I0325 01:34:19.684483 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/4fc43466-026d-4115-aa03-cae9bde3cb59-node-certs\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.686594 kubelet[2918]: I0325 01:34:19.684514 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4fc43466-026d-4115-aa03-cae9bde3cb59-lib-modules\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.686594 kubelet[2918]: I0325 01:34:19.684546 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7zzz\" (UniqueName: \"kubernetes.io/projected/4fc43466-026d-4115-aa03-cae9bde3cb59-kube-api-access-p7zzz\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.686594 kubelet[2918]: I0325 01:34:19.684576 2918 
reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/4fc43466-026d-4115-aa03-cae9bde3cb59-var-run-calico\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.688311 containerd[1528]: time="2025-03-25T01:34:19.686259464Z" level=info msg="RemoveContainer for \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\"" Mar 25 01:34:19.688451 kubelet[2918]: I0325 01:34:19.684603 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4fc43466-026d-4115-aa03-cae9bde3cb59-var-lib-calico\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.688451 kubelet[2918]: I0325 01:34:19.684630 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/4fc43466-026d-4115-aa03-cae9bde3cb59-cni-bin-dir\") pod \"calico-node-m2l8g\" (UID: \"4fc43466-026d-4115-aa03-cae9bde3cb59\") " pod="calico-system/calico-node-m2l8g" Mar 25 01:34:19.688451 kubelet[2918]: I0325 01:34:19.684779 2918 reconciler_common.go:289] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-log-dir\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.688451 kubelet[2918]: I0325 01:34:19.684804 2918 reconciler_common.go:289] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-var-run-calico\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.688451 kubelet[2918]: I0325 01:34:19.684822 2918 reconciler_common.go:289] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-var-lib-calico\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.698124 containerd[1528]: time="2025-03-25T01:34:19.698051336Z" level=info msg="RemoveContainer for \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" returns successfully" Mar 25 01:34:19.709479 kubelet[2918]: I0325 01:34:19.708526 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 25 01:34:19.709479 kubelet[2918]: I0325 01:34:19.708603 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-policysync" (OuterVolumeSpecName: "policysync") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 25 01:34:19.709479 kubelet[2918]: I0325 01:34:19.708641 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). 
InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 25 01:34:19.709752 kubelet[2918]: I0325 01:34:19.709481 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 25 01:34:19.709752 kubelet[2918]: I0325 01:34:19.709563 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 25 01:34:19.709752 kubelet[2918]: I0325 01:34:19.709617 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 25 01:34:19.717712 systemd[1]: var-lib-kubelet-pods-d3b4fc6b\x2d6a13\x2d424a\x2da46e\x2d4552d4b72a7a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dmmxbk.mount: Deactivated successfully. Mar 25 01:34:19.725231 systemd[1]: var-lib-kubelet-pods-d3b4fc6b\x2d6a13\x2d424a\x2da46e\x2d4552d4b72a7a-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. Mar 25 01:34:19.731922 kubelet[2918]: I0325 01:34:19.731217 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-kube-api-access-mmxbk" (OuterVolumeSpecName: "kube-api-access-mmxbk") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). InnerVolumeSpecName "kube-api-access-mmxbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 25 01:34:19.732921 kubelet[2918]: I0325 01:34:19.732788 2918 scope.go:117] "RemoveContainer" containerID="00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908" Mar 25 01:34:19.735280 kubelet[2918]: I0325 01:34:19.735109 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-node-certs" (OuterVolumeSpecName: "node-certs") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 25 01:34:19.740462 containerd[1528]: time="2025-03-25T01:34:19.738773065Z" level=info msg="RemoveContainer for \"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\"" Mar 25 01:34:19.742678 systemd[1]: var-lib-kubelet-pods-d3b4fc6b\x2d6a13\x2d424a\x2da46e\x2d4552d4b72a7a-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. 
Mar 25 01:34:19.746287 kubelet[2918]: I0325 01:34:19.746136 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" (UID: "d3b4fc6b-6a13-424a-a46e-4552d4b72a7a"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 25 01:34:19.756312 containerd[1528]: time="2025-03-25T01:34:19.755732031Z" level=info msg="RemoveContainer for \"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\" returns successfully" Mar 25 01:34:19.757824 kubelet[2918]: I0325 01:34:19.757787 2918 scope.go:117] "RemoveContainer" containerID="3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b" Mar 25 01:34:19.782657 containerd[1528]: time="2025-03-25T01:34:19.782465550Z" level=info msg="RemoveContainer for \"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\"" Mar 25 01:34:19.786058 kubelet[2918]: I0325 01:34:19.786014 2918 reconciler_common.go:289] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-bin-dir\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.786193 kubelet[2918]: I0325 01:34:19.786064 2918 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-tigera-ca-bundle\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.786193 kubelet[2918]: I0325 01:34:19.786090 2918 reconciler_common.go:289] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-node-certs\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.786193 kubelet[2918]: I0325 01:34:19.786107 2918 reconciler_common.go:289] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-lib-modules\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.786193 kubelet[2918]: I0325 01:34:19.786121 2918 reconciler_common.go:289] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-cni-net-dir\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.786193 kubelet[2918]: I0325 01:34:19.786135 2918 reconciler_common.go:289] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-flexvol-driver-host\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.786193 kubelet[2918]: I0325 01:34:19.786152 2918 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-mmxbk\" (UniqueName: \"kubernetes.io/projected/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-kube-api-access-mmxbk\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.786193 kubelet[2918]: I0325 01:34:19.786168 2918 reconciler_common.go:289] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-xtables-lock\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.786193 kubelet[2918]: I0325 01:34:19.786183 2918 reconciler_common.go:289] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a-policysync\") on node 
\"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:19.804303 containerd[1528]: time="2025-03-25T01:34:19.802811290Z" level=info msg="RemoveContainer for \"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\" returns successfully" Mar 25 01:34:19.805844 kubelet[2918]: I0325 01:34:19.805805 2918 scope.go:117] "RemoveContainer" containerID="bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b" Mar 25 01:34:19.813158 containerd[1528]: time="2025-03-25T01:34:19.813087435Z" level=error msg="ContainerStatus for \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\": not found" Mar 25 01:34:19.815422 kubelet[2918]: E0325 01:34:19.815381 2918 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\": not found" containerID="bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b" Mar 25 01:34:19.833369 kubelet[2918]: I0325 01:34:19.815459 2918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b"} err="failed to get container status \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\": rpc error: code = NotFound desc = an error occurred when try to find container \"bf6dce9e880d66f0063f3dd2e8ee68eb68f600664a546bc8159354deff33142b\": not found" Mar 25 01:34:19.833369 kubelet[2918]: I0325 01:34:19.833372 2918 scope.go:117] "RemoveContainer" containerID="00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908" Mar 25 01:34:19.834519 containerd[1528]: time="2025-03-25T01:34:19.833831500Z" level=error msg="ContainerStatus for \"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\": not found" Mar 25 01:34:19.834642 kubelet[2918]: E0325 01:34:19.834039 2918 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\": not found" containerID="00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908" Mar 25 01:34:19.834642 kubelet[2918]: I0325 01:34:19.834076 2918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908"} err="failed to get container status \"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\": rpc error: code = NotFound desc = an error occurred when try to find container \"00e36673fdf48600ed41e1dac284ecc890a6d467ea8a8e92103c2fd03d9f4908\": not found" Mar 25 01:34:19.834642 kubelet[2918]: I0325 01:34:19.834098 2918 scope.go:117] "RemoveContainer" containerID="3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b" Mar 25 01:34:19.835002 kubelet[2918]: E0325 01:34:19.834972 2918 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\": not 
found" containerID="3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b" Mar 25 01:34:19.835132 containerd[1528]: time="2025-03-25T01:34:19.834744935Z" level=error msg="ContainerStatus for \"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\": not found" Mar 25 01:34:19.835207 kubelet[2918]: I0325 01:34:19.835003 2918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b"} err="failed to get container status \"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\": rpc error: code = NotFound desc = an error occurred when try to find container \"3db766028874b78c87c8d04594994db7041c27d61f829a6e5e62248449013a9b\": not found" Mar 25 01:34:19.961011 containerd[1528]: time="2025-03-25T01:34:19.960876570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m2l8g,Uid:4fc43466-026d-4115-aa03-cae9bde3cb59,Namespace:calico-system,Attempt:0,}" Mar 25 01:34:20.014863 containerd[1528]: time="2025-03-25T01:34:20.012415924Z" level=info msg="connecting to shim 0de677804933a3a0a60b53937d9d7d88ce9ae0d926f217d339f3052db8f2b6ae" address="unix:///run/containerd/s/d9bc380a031926ff47781854a2c9979f34eb3dbc96c94dd3cd55118f7b1e4e7f" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:34:20.015860 systemd[1]: Removed slice kubepods-besteffort-podd3b4fc6b_6a13_424a_a46e_4552d4b72a7a.slice - libcontainer container kubepods-besteffort-podd3b4fc6b_6a13_424a_a46e_4552d4b72a7a.slice. Mar 25 01:34:20.016009 systemd[1]: kubepods-besteffort-podd3b4fc6b_6a13_424a_a46e_4552d4b72a7a.slice: Consumed 3.128s CPU time, 207.6M memory peak, 38.6M read from disk, 161M written to disk. Mar 25 01:34:20.081699 systemd[1]: Started cri-containerd-0de677804933a3a0a60b53937d9d7d88ce9ae0d926f217d339f3052db8f2b6ae.scope - libcontainer container 0de677804933a3a0a60b53937d9d7d88ce9ae0d926f217d339f3052db8f2b6ae. 
Mar 25 01:34:20.159200 containerd[1528]: time="2025-03-25T01:34:20.158658273Z" level=info msg="StopPodSandbox for \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\"" Mar 25 01:34:20.159200 containerd[1528]: time="2025-03-25T01:34:20.159032077Z" level=info msg="TearDown network for sandbox \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" successfully" Mar 25 01:34:20.159200 containerd[1528]: time="2025-03-25T01:34:20.159065094Z" level=info msg="StopPodSandbox for \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" returns successfully" Mar 25 01:34:20.162517 containerd[1528]: time="2025-03-25T01:34:20.162311421Z" level=info msg="RemovePodSandbox for \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\"" Mar 25 01:34:20.162917 containerd[1528]: time="2025-03-25T01:34:20.162399358Z" level=info msg="Forcibly stopping sandbox \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\"" Mar 25 01:34:20.162917 containerd[1528]: time="2025-03-25T01:34:20.162877771Z" level=info msg="TearDown network for sandbox \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" successfully" Mar 25 01:34:20.172083 containerd[1528]: time="2025-03-25T01:34:20.171629325Z" level=info msg="Ensure that sandbox ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f in task-service has been cleanup successfully" Mar 25 01:34:20.176841 containerd[1528]: time="2025-03-25T01:34:20.176726354Z" level=info msg="RemovePodSandbox \"ed19b1b5786ea5ffba2e696b2ca0afac1ca82d70ed40073810dbd6355d29360f\" returns successfully" Mar 25 01:34:20.222326 containerd[1528]: time="2025-03-25T01:34:20.222260879Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-m2l8g,Uid:4fc43466-026d-4115-aa03-cae9bde3cb59,Namespace:calico-system,Attempt:0,} returns sandbox id \"0de677804933a3a0a60b53937d9d7d88ce9ae0d926f217d339f3052db8f2b6ae\"" Mar 25 01:34:20.228264 containerd[1528]: time="2025-03-25T01:34:20.228226787Z" level=info msg="CreateContainer within sandbox \"0de677804933a3a0a60b53937d9d7d88ce9ae0d926f217d339f3052db8f2b6ae\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 01:34:20.242284 systemd[1]: cri-containerd-9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e.scope: Deactivated successfully. Mar 25 01:34:20.242869 systemd[1]: cri-containerd-9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e.scope: Consumed 348ms CPU time, 32.8M memory peak, 24.5M read from disk. 
Mar 25 01:34:20.252776 containerd[1528]: time="2025-03-25T01:34:20.252716710Z" level=info msg="Container 0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:20.267898 containerd[1528]: time="2025-03-25T01:34:20.266933972Z" level=info msg="received exit event container_id:\"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\" id:\"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\" pid:3541 exit_status:1 exited_at:{seconds:1742866460 nanos:262610882}" Mar 25 01:34:20.268564 containerd[1528]: time="2025-03-25T01:34:20.267991843Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\" id:\"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\" pid:3541 exit_status:1 exited_at:{seconds:1742866460 nanos:262610882}" Mar 25 01:34:20.268564 containerd[1528]: time="2025-03-25T01:34:20.268312072Z" level=info msg="CreateContainer within sandbox \"0de677804933a3a0a60b53937d9d7d88ce9ae0d926f217d339f3052db8f2b6ae\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780\"" Mar 25 01:34:20.272523 containerd[1528]: time="2025-03-25T01:34:20.271891298Z" level=info msg="StartContainer for \"0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780\"" Mar 25 01:34:20.276406 containerd[1528]: time="2025-03-25T01:34:20.276346384Z" level=info msg="connecting to shim 0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780" address="unix:///run/containerd/s/d9bc380a031926ff47781854a2c9979f34eb3dbc96c94dd3cd55118f7b1e4e7f" protocol=ttrpc version=3 Mar 25 01:34:20.328939 systemd[1]: Started cri-containerd-0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780.scope - libcontainer container 0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780. Mar 25 01:34:20.457239 containerd[1528]: time="2025-03-25T01:34:20.456642628Z" level=info msg="StopContainer for \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\" returns successfully" Mar 25 01:34:20.458210 containerd[1528]: time="2025-03-25T01:34:20.458113267Z" level=info msg="StopPodSandbox for \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\"" Mar 25 01:34:20.458585 containerd[1528]: time="2025-03-25T01:34:20.458217075Z" level=info msg="Container to stop \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 25 01:34:20.488651 systemd[1]: cri-containerd-3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b.scope: Deactivated successfully. Mar 25 01:34:20.495689 containerd[1528]: time="2025-03-25T01:34:20.495523328Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\" id:\"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\" pid:3379 exit_status:137 exited_at:{seconds:1742866460 nanos:493081109}" Mar 25 01:34:20.522905 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e-rootfs.mount: Deactivated successfully. 
Mar 25 01:34:20.585687 containerd[1528]: time="2025-03-25T01:34:20.585090447Z" level=info msg="received exit event sandbox_id:\"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\" exit_status:137 exited_at:{seconds:1742866460 nanos:493081109}" Mar 25 01:34:20.586409 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b-rootfs.mount: Deactivated successfully. Mar 25 01:34:20.588987 containerd[1528]: time="2025-03-25T01:34:20.588843004Z" level=info msg="shim disconnected" id=3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b namespace=k8s.io Mar 25 01:34:20.588987 containerd[1528]: time="2025-03-25T01:34:20.588874892Z" level=warning msg="cleaning up after shim disconnected" id=3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b namespace=k8s.io Mar 25 01:34:20.588987 containerd[1528]: time="2025-03-25T01:34:20.588888849Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 25 01:34:20.595904 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b-shm.mount: Deactivated successfully. Mar 25 01:34:20.599225 containerd[1528]: time="2025-03-25T01:34:20.597306178Z" level=info msg="TearDown network for sandbox \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\" successfully" Mar 25 01:34:20.599225 containerd[1528]: time="2025-03-25T01:34:20.598827907Z" level=info msg="StopPodSandbox for \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\" returns successfully" Mar 25 01:34:20.697921 kubelet[2918]: I0325 01:34:20.697862 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-typha-certs\") pod \"fd871a25-7d2c-4fb4-8049-0fd8f241ab9c\" (UID: \"fd871a25-7d2c-4fb4-8049-0fd8f241ab9c\") " Mar 25 01:34:20.700238 kubelet[2918]: I0325 01:34:20.697947 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-tigera-ca-bundle\") pod \"fd871a25-7d2c-4fb4-8049-0fd8f241ab9c\" (UID: \"fd871a25-7d2c-4fb4-8049-0fd8f241ab9c\") " Mar 25 01:34:20.700238 kubelet[2918]: I0325 01:34:20.698012 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4x79\" (UniqueName: \"kubernetes.io/projected/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-kube-api-access-p4x79\") pod \"fd871a25-7d2c-4fb4-8049-0fd8f241ab9c\" (UID: \"fd871a25-7d2c-4fb4-8049-0fd8f241ab9c\") " Mar 25 01:34:20.714930 containerd[1528]: time="2025-03-25T01:34:20.714672448Z" level=info msg="StartContainer for \"0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780\" returns successfully" Mar 25 01:34:20.720912 kubelet[2918]: I0325 01:34:20.720832 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-kube-api-access-p4x79" (OuterVolumeSpecName: "kube-api-access-p4x79") pod "fd871a25-7d2c-4fb4-8049-0fd8f241ab9c" (UID: "fd871a25-7d2c-4fb4-8049-0fd8f241ab9c"). InnerVolumeSpecName "kube-api-access-p4x79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 25 01:34:20.721906 systemd[1]: var-lib-kubelet-pods-fd871a25\x2d7d2c\x2d4fb4\x2d8049\x2d0fd8f241ab9c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dp4x79.mount: Deactivated successfully. 
Mar 25 01:34:20.722071 systemd[1]: var-lib-kubelet-pods-fd871a25\x2d7d2c\x2d4fb4\x2d8049\x2d0fd8f241ab9c-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. Mar 25 01:34:20.724290 kubelet[2918]: I0325 01:34:20.722738 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "fd871a25-7d2c-4fb4-8049-0fd8f241ab9c" (UID: "fd871a25-7d2c-4fb4-8049-0fd8f241ab9c"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 25 01:34:20.736864 systemd[1]: var-lib-kubelet-pods-fd871a25\x2d7d2c\x2d4fb4\x2d8049\x2d0fd8f241ab9c-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. Mar 25 01:34:20.742815 kubelet[2918]: I0325 01:34:20.742272 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "fd871a25-7d2c-4fb4-8049-0fd8f241ab9c" (UID: "fd871a25-7d2c-4fb4-8049-0fd8f241ab9c"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 25 01:34:20.748102 kubelet[2918]: I0325 01:34:20.747145 2918 scope.go:117] "RemoveContainer" containerID="9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e" Mar 25 01:34:20.759333 systemd[1]: Removed slice kubepods-besteffort-podfd871a25_7d2c_4fb4_8049_0fd8f241ab9c.slice - libcontainer container kubepods-besteffort-podfd871a25_7d2c_4fb4_8049_0fd8f241ab9c.slice. Mar 25 01:34:20.761991 containerd[1528]: time="2025-03-25T01:34:20.760897452Z" level=info msg="RemoveContainer for \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\"" Mar 25 01:34:20.759515 systemd[1]: kubepods-besteffort-podfd871a25_7d2c_4fb4_8049_0fd8f241ab9c.slice: Consumed 398ms CPU time, 33.1M memory peak, 24.5M read from disk. 
Mar 25 01:34:20.796166 containerd[1528]: time="2025-03-25T01:34:20.795799353Z" level=info msg="RemoveContainer for \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\" returns successfully" Mar 25 01:34:20.800758 kubelet[2918]: I0325 01:34:20.800089 2918 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-tigera-ca-bundle\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:20.800758 kubelet[2918]: I0325 01:34:20.800192 2918 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-p4x79\" (UniqueName: \"kubernetes.io/projected/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-kube-api-access-p4x79\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:20.800758 kubelet[2918]: I0325 01:34:20.800213 2918 reconciler_common.go:289] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c-typha-certs\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:20.801904 kubelet[2918]: I0325 01:34:20.801554 2918 scope.go:117] "RemoveContainer" containerID="9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e" Mar 25 01:34:20.804499 containerd[1528]: time="2025-03-25T01:34:20.803731232Z" level=error msg="ContainerStatus for \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\": not found" Mar 25 01:34:20.804499 containerd[1528]: time="2025-03-25T01:34:20.803890612Z" level=info msg="StopContainer for \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" with timeout 30 (s)" Mar 25 01:34:20.804656 kubelet[2918]: E0325 01:34:20.804617 2918 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\": not found" containerID="9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e" Mar 25 01:34:20.804747 kubelet[2918]: I0325 01:34:20.804680 2918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e"} err="failed to get container status \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\": rpc error: code = NotFound desc = an error occurred when try to find container \"9b1b5982cbb3fdb84ba1a269e093eee10cb4a64efa4af35b0ba20be5df9eb74e\": not found" Mar 25 01:34:20.810917 containerd[1528]: time="2025-03-25T01:34:20.810709475Z" level=info msg="Stop container \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" with signal terminated" Mar 25 01:34:20.886257 systemd[1]: cri-containerd-146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea.scope: Deactivated successfully. 
Mar 25 01:34:20.909803 containerd[1528]: time="2025-03-25T01:34:20.909717472Z" level=info msg="received exit event container_id:\"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" id:\"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" pid:4770 exit_status:2 exited_at:{seconds:1742866460 nanos:904796872}" Mar 25 01:34:20.912937 containerd[1528]: time="2025-03-25T01:34:20.912610727Z" level=info msg="TaskExit event in podsandbox handler container_id:\"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" id:\"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" pid:4770 exit_status:2 exited_at:{seconds:1742866460 nanos:904796872}" Mar 25 01:34:21.091668 systemd[1]: cri-containerd-0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780.scope: Deactivated successfully. Mar 25 01:34:21.092081 systemd[1]: cri-containerd-0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780.scope: Consumed 78ms CPU time, 19.1M memory peak, 11.2M read from disk, 6.3M written to disk. Mar 25 01:34:21.102029 containerd[1528]: time="2025-03-25T01:34:21.101962238Z" level=info msg="StopContainer for \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" returns successfully" Mar 25 01:34:21.104905 containerd[1528]: time="2025-03-25T01:34:21.104413359Z" level=info msg="StopPodSandbox for \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\"" Mar 25 01:34:21.104905 containerd[1528]: time="2025-03-25T01:34:21.104562933Z" level=info msg="Container to stop \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 25 01:34:21.106773 containerd[1528]: time="2025-03-25T01:34:21.106740066Z" level=info msg="received exit event container_id:\"0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780\" id:\"0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780\" pid:5036 exited_at:{seconds:1742866461 nanos:105157550}" Mar 25 01:34:21.111383 containerd[1528]: time="2025-03-25T01:34:21.109700047Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780\" id:\"0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780\" pid:5036 exited_at:{seconds:1742866461 nanos:105157550}" Mar 25 01:34:21.142164 systemd[1]: cri-containerd-249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe.scope: Deactivated successfully. 
Mar 25 01:34:21.151027 containerd[1528]: time="2025-03-25T01:34:21.150767412Z" level=info msg="TaskExit event in podsandbox handler container_id:\"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\" id:\"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\" pid:4592 exit_status:137 exited_at:{seconds:1742866461 nanos:150196468}" Mar 25 01:34:21.263651 containerd[1528]: time="2025-03-25T01:34:21.262511747Z" level=info msg="shim disconnected" id=249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe namespace=k8s.io Mar 25 01:34:21.263651 containerd[1528]: time="2025-03-25T01:34:21.263598946Z" level=warning msg="cleaning up after shim disconnected" id=249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe namespace=k8s.io Mar 25 01:34:21.264561 containerd[1528]: time="2025-03-25T01:34:21.263615541Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 25 01:34:21.313998 containerd[1528]: time="2025-03-25T01:34:21.313854328Z" level=info msg="received exit event sandbox_id:\"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\" exit_status:137 exited_at:{seconds:1742866461 nanos:150196468}" Mar 25 01:34:21.512819 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0eb83f7e4552a7f4bec9b526df35e01010054b3adb8592df4f2ac3f5236d0780-rootfs.mount: Deactivated successfully. Mar 25 01:34:21.512996 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea-rootfs.mount: Deactivated successfully. Mar 25 01:34:21.513119 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe-rootfs.mount: Deactivated successfully. Mar 25 01:34:21.513229 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe-shm.mount: Deactivated successfully. 
Mar 25 01:34:21.559927 containerd[1528]: time="2025-03-25T01:34:21.559145220Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:21.563285 containerd[1528]: time="2025-03-25T01:34:21.563190785Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843" Mar 25 01:34:21.565472 containerd[1528]: time="2025-03-25T01:34:21.564818451Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:21.569559 containerd[1528]: time="2025-03-25T01:34:21.569421168Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:34:21.571813 containerd[1528]: time="2025-03-25T01:34:21.571152855Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.694854914s" Mar 25 01:34:21.571813 containerd[1528]: time="2025-03-25T01:34:21.571199283Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\"" Mar 25 01:34:21.579541 containerd[1528]: time="2025-03-25T01:34:21.579161394Z" level=info msg="CreateContainer within sandbox \"b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 25 01:34:21.589207 containerd[1528]: time="2025-03-25T01:34:21.589165641Z" level=info msg="Container cd343fc5fa9ba10ea3b3a49c95c991e50a5d7701e2c4e9a4080d8e9d494e36b6: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:21.604899 containerd[1528]: time="2025-03-25T01:34:21.604803309Z" level=info msg="CreateContainer within sandbox \"b1230486f0161fdfd8d7fb6904c29a8fbfea39f8851562a9b9d19afcfa3738d0\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"cd343fc5fa9ba10ea3b3a49c95c991e50a5d7701e2c4e9a4080d8e9d494e36b6\"" Mar 25 01:34:21.606367 containerd[1528]: time="2025-03-25T01:34:21.605903114Z" level=info msg="StartContainer for \"cd343fc5fa9ba10ea3b3a49c95c991e50a5d7701e2c4e9a4080d8e9d494e36b6\"" Mar 25 01:34:21.608060 containerd[1528]: time="2025-03-25T01:34:21.608030752Z" level=info msg="connecting to shim cd343fc5fa9ba10ea3b3a49c95c991e50a5d7701e2c4e9a4080d8e9d494e36b6" address="unix:///run/containerd/s/8ccc0b83cef0841e4d69b18d3dc87596ec5864825be57a0237c7d10a5136d8ad" protocol=ttrpc version=3 Mar 25 01:34:21.629578 systemd-networkd[1432]: calib44b6509e82: Link DOWN Mar 25 01:34:21.630401 systemd-networkd[1432]: calib44b6509e82: Lost carrier Mar 25 01:34:21.673189 systemd[1]: Started cri-containerd-cd343fc5fa9ba10ea3b3a49c95c991e50a5d7701e2c4e9a4080d8e9d494e36b6.scope - libcontainer container cd343fc5fa9ba10ea3b3a49c95c991e50a5d7701e2c4e9a4080d8e9d494e36b6. 
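Editor's note: for a rough sense of pull speed, the bytes-read and elapsed-time values in the node-driver-registrar pull messages above can be combined directly. A small sketch using the logged numbers; the MiB/s figure is only an average over the whole pull:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Values from the "stop pulling image" and "Pulled image ... in 2.694854914s" entries above.
        const bytesRead = 13986843
        elapsed, err := time.ParseDuration("2.694854914s")
        if err != nil {
            panic(err)
        }

        mib := float64(bytesRead) / (1 << 20)
        fmt.Printf("%.2f MiB in %s = %.2f MiB/s on average\n", mib, elapsed, mib/elapsed.Seconds())
    }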
Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.615 [INFO][5192] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.620 [INFO][5192] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" iface="eth0" netns="/var/run/netns/cni-7edc6d77-989b-c7cf-62ac-cc5f131cd18e" Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.625 [INFO][5192] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" iface="eth0" netns="/var/run/netns/cni-7edc6d77-989b-c7cf-62ac-cc5f131cd18e" Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.633 [INFO][5192] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" after=11.200975ms iface="eth0" netns="/var/run/netns/cni-7edc6d77-989b-c7cf-62ac-cc5f131cd18e" Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.633 [INFO][5192] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.633 [INFO][5192] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.729 [INFO][5205] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.729 [INFO][5205] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.730 [INFO][5205] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.796 [INFO][5205] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.796 [INFO][5205] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0" Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.798 [INFO][5205] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:34:21.809770 containerd[1528]: 2025-03-25 01:34:21.806 [INFO][5192] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Mar 25 01:34:21.811607 containerd[1528]: time="2025-03-25T01:34:21.811558826Z" level=info msg="TearDown network for sandbox \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\" successfully" Mar 25 01:34:21.811819 containerd[1528]: time="2025-03-25T01:34:21.811714752Z" level=info msg="StopPodSandbox for \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\" returns successfully" Mar 25 01:34:21.819873 systemd[1]: run-netns-cni\x2d7edc6d77\x2d989b\x2dc7cf\x2d62ac\x2dcc5f131cd18e.mount: Deactivated successfully. Mar 25 01:34:21.828752 containerd[1528]: time="2025-03-25T01:34:21.828672062Z" level=info msg="CreateContainer within sandbox \"0de677804933a3a0a60b53937d9d7d88ce9ae0d926f217d339f3052db8f2b6ae\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 01:34:21.845054 kubelet[2918]: I0325 01:34:21.844972 2918 scope.go:117] "RemoveContainer" containerID="146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea" Mar 25 01:34:21.855700 containerd[1528]: time="2025-03-25T01:34:21.853646216Z" level=info msg="Container 1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:21.864687 containerd[1528]: time="2025-03-25T01:34:21.864645635Z" level=info msg="RemoveContainer for \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\"" Mar 25 01:34:21.868872 containerd[1528]: time="2025-03-25T01:34:21.868839101Z" level=info msg="StartContainer for \"cd343fc5fa9ba10ea3b3a49c95c991e50a5d7701e2c4e9a4080d8e9d494e36b6\" returns successfully" Mar 25 01:34:21.873281 containerd[1528]: time="2025-03-25T01:34:21.873055375Z" level=info msg="RemoveContainer for \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" returns successfully" Mar 25 01:34:21.873773 kubelet[2918]: I0325 01:34:21.873618 2918 scope.go:117] "RemoveContainer" containerID="146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea" Mar 25 01:34:21.874141 containerd[1528]: time="2025-03-25T01:34:21.874083177Z" level=error msg="ContainerStatus for \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\": not found" Mar 25 01:34:21.874446 kubelet[2918]: E0325 01:34:21.874353 2918 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\": not found" containerID="146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea" Mar 25 01:34:21.874446 kubelet[2918]: I0325 01:34:21.874395 2918 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea"} err="failed to get container status \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\": rpc error: code = NotFound desc = an error occurred when try to find container \"146bb7ebd56b750f3843e2070fc66f2a91fa2096a960bb77a690bd08e1bdb5ea\": not found" Mar 25 01:34:21.879120 containerd[1528]: time="2025-03-25T01:34:21.879065534Z" level=info msg="CreateContainer within sandbox \"0de677804933a3a0a60b53937d9d7d88ce9ae0d926f217d339f3052db8f2b6ae\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id 
\"1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347\"" Mar 25 01:34:21.880563 containerd[1528]: time="2025-03-25T01:34:21.880511110Z" level=info msg="StartContainer for \"1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347\"" Mar 25 01:34:21.886674 containerd[1528]: time="2025-03-25T01:34:21.886610581Z" level=info msg="connecting to shim 1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347" address="unix:///run/containerd/s/d9bc380a031926ff47781854a2c9979f34eb3dbc96c94dd3cd55118f7b1e4e7f" protocol=ttrpc version=3 Mar 25 01:34:21.933603 kubelet[2918]: I0325 01:34:21.932567 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kzwn\" (UniqueName: \"kubernetes.io/projected/6685e276-da04-413c-850f-cc7ff0494084-kube-api-access-6kzwn\") pod \"6685e276-da04-413c-850f-cc7ff0494084\" (UID: \"6685e276-da04-413c-850f-cc7ff0494084\") " Mar 25 01:34:21.933603 kubelet[2918]: I0325 01:34:21.932624 2918 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6685e276-da04-413c-850f-cc7ff0494084-tigera-ca-bundle\") pod \"6685e276-da04-413c-850f-cc7ff0494084\" (UID: \"6685e276-da04-413c-850f-cc7ff0494084\") " Mar 25 01:34:21.937618 systemd[1]: Started cri-containerd-1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347.scope - libcontainer container 1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347. Mar 25 01:34:21.951497 kubelet[2918]: I0325 01:34:21.949567 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6685e276-da04-413c-850f-cc7ff0494084-kube-api-access-6kzwn" (OuterVolumeSpecName: "kube-api-access-6kzwn") pod "6685e276-da04-413c-850f-cc7ff0494084" (UID: "6685e276-da04-413c-850f-cc7ff0494084"). InnerVolumeSpecName "kube-api-access-6kzwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 25 01:34:21.953033 kubelet[2918]: I0325 01:34:21.952989 2918 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6685e276-da04-413c-850f-cc7ff0494084-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "6685e276-da04-413c-850f-cc7ff0494084" (UID: "6685e276-da04-413c-850f-cc7ff0494084"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 25 01:34:22.033700 kubelet[2918]: I0325 01:34:22.033589 2918 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-6kzwn\" (UniqueName: \"kubernetes.io/projected/6685e276-da04-413c-850f-cc7ff0494084-kube-api-access-6kzwn\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:22.033700 kubelet[2918]: I0325 01:34:22.033640 2918 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6685e276-da04-413c-850f-cc7ff0494084-tigera-ca-bundle\") on node \"srv-y0b1r.gb1.brightbox.com\" DevicePath \"\"" Mar 25 01:34:22.050851 kubelet[2918]: I0325 01:34:22.049664 2918 topology_manager.go:215] "Topology Admit Handler" podUID="056db95f-5250-4070-acd4-1abd5a830a83" podNamespace="calico-system" podName="calico-typha-b85658cf8-5ptbz" Mar 25 01:34:22.050851 kubelet[2918]: E0325 01:34:22.049846 2918 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="fd871a25-7d2c-4fb4-8049-0fd8f241ab9c" containerName="calico-typha" Mar 25 01:34:22.050851 kubelet[2918]: E0325 01:34:22.050397 2918 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="6685e276-da04-413c-850f-cc7ff0494084" containerName="calico-kube-controllers" Mar 25 01:34:22.050851 kubelet[2918]: I0325 01:34:22.050764 2918 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd871a25-7d2c-4fb4-8049-0fd8f241ab9c" containerName="calico-typha" Mar 25 01:34:22.050851 kubelet[2918]: I0325 01:34:22.050787 2918 memory_manager.go:354] "RemoveStaleState removing state" podUID="6685e276-da04-413c-850f-cc7ff0494084" containerName="calico-kube-controllers" Mar 25 01:34:22.067029 systemd[1]: Created slice kubepods-besteffort-pod056db95f_5250_4070_acd4_1abd5a830a83.slice - libcontainer container kubepods-besteffort-pod056db95f_5250_4070_acd4_1abd5a830a83.slice. Mar 25 01:34:22.102739 kubelet[2918]: I0325 01:34:22.102663 2918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3b4fc6b-6a13-424a-a46e-4552d4b72a7a" path="/var/lib/kubelet/pods/d3b4fc6b-6a13-424a-a46e-4552d4b72a7a/volumes" Mar 25 01:34:22.106056 kubelet[2918]: I0325 01:34:22.106025 2918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd871a25-7d2c-4fb4-8049-0fd8f241ab9c" path="/var/lib/kubelet/pods/fd871a25-7d2c-4fb4-8049-0fd8f241ab9c/volumes" Mar 25 01:34:22.122573 systemd[1]: Removed slice kubepods-besteffort-pod6685e276_da04_413c_850f_cc7ff0494084.slice - libcontainer container kubepods-besteffort-pod6685e276_da04_413c_850f_cc7ff0494084.slice. 
Mar 25 01:34:22.134489 kubelet[2918]: I0325 01:34:22.134418 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j9lw\" (UniqueName: \"kubernetes.io/projected/056db95f-5250-4070-acd4-1abd5a830a83-kube-api-access-8j9lw\") pod \"calico-typha-b85658cf8-5ptbz\" (UID: \"056db95f-5250-4070-acd4-1abd5a830a83\") " pod="calico-system/calico-typha-b85658cf8-5ptbz" Mar 25 01:34:22.134910 kubelet[2918]: I0325 01:34:22.134675 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/056db95f-5250-4070-acd4-1abd5a830a83-tigera-ca-bundle\") pod \"calico-typha-b85658cf8-5ptbz\" (UID: \"056db95f-5250-4070-acd4-1abd5a830a83\") " pod="calico-system/calico-typha-b85658cf8-5ptbz" Mar 25 01:34:22.134910 kubelet[2918]: I0325 01:34:22.134746 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/056db95f-5250-4070-acd4-1abd5a830a83-typha-certs\") pod \"calico-typha-b85658cf8-5ptbz\" (UID: \"056db95f-5250-4070-acd4-1abd5a830a83\") " pod="calico-system/calico-typha-b85658cf8-5ptbz" Mar 25 01:34:22.209611 containerd[1528]: time="2025-03-25T01:34:22.209565482Z" level=info msg="StartContainer for \"1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347\" returns successfully" Mar 25 01:34:22.382418 containerd[1528]: time="2025-03-25T01:34:22.381808756Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b85658cf8-5ptbz,Uid:056db95f-5250-4070-acd4-1abd5a830a83,Namespace:calico-system,Attempt:0,}" Mar 25 01:34:22.426946 containerd[1528]: time="2025-03-25T01:34:22.426764650Z" level=info msg="connecting to shim cac0d83fcd4b8dc846b59d3b6193849c2ec4f4b2cb97c97c8eaf3f5b694be23d" address="unix:///run/containerd/s/4424ea4be07df011e2b5faa8fef76860ec9525bc3a023de21a7b0d90da53e58c" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:34:22.494778 systemd[1]: Started cri-containerd-cac0d83fcd4b8dc846b59d3b6193849c2ec4f4b2cb97c97c8eaf3f5b694be23d.scope - libcontainer container cac0d83fcd4b8dc846b59d3b6193849c2ec4f4b2cb97c97c8eaf3f5b694be23d. Mar 25 01:34:22.520512 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1330043159.mount: Deactivated successfully. Mar 25 01:34:22.520681 systemd[1]: var-lib-kubelet-pods-6685e276\x2dda04\x2d413c\x2d850f\x2dcc7ff0494084-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully. Mar 25 01:34:22.520833 systemd[1]: var-lib-kubelet-pods-6685e276\x2dda04\x2d413c\x2d850f\x2dcc7ff0494084-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d6kzwn.mount: Deactivated successfully. 
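Editor's note: the \x2d and \x7e sequences in the mount unit names above are systemd's path escaping ("/" becomes "-", literal bytes become \xNN). The sketch below is a rough inverse, enough to read those names back as paths; it skips corner cases that systemd-escape --unescape --path handles properly:

    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    // unescapeUnit reverses systemd's path escaping as seen in the .mount names
    // above: "-" separates path components and literal bytes appear as \xNN
    // (so "\x2d" is a dash and "\x7e" is a tilde).
    func unescapeUnit(name string) string {
        name = strings.TrimSuffix(name, ".mount")
        var b strings.Builder
        b.WriteByte('/')
        for i := 0; i < len(name); i++ {
            switch {
            case name[i] == '-':
                b.WriteByte('/')
            case name[i] == '\\' && i+3 < len(name) && name[i+1] == 'x':
                if v, err := strconv.ParseUint(name[i+2:i+4], 16, 8); err == nil {
                    b.WriteByte(byte(v))
                    i += 3
                    continue
                }
                b.WriteByte(name[i])
            default:
                b.WriteByte(name[i])
            }
        }
        return b.String()
    }

    func main() {
        unit := `var-lib-kubelet-pods-6685e276\x2dda04\x2d413c\x2d850f\x2dcc7ff0494084-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d6kzwn.mount`
        fmt.Println(unescapeUnit(unit))
        // /var/lib/kubelet/pods/6685e276-da04-413c-850f-cc7ff0494084/volumes/kubernetes.io~projected/kube-api-access-6kzwn
    }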
Mar 25 01:34:22.528039 kubelet[2918]: I0325 01:34:22.527264 2918 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 25 01:34:22.529358 kubelet[2918]: I0325 01:34:22.529223 2918 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 25 01:34:22.709110 containerd[1528]: time="2025-03-25T01:34:22.708956393Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b85658cf8-5ptbz,Uid:056db95f-5250-4070-acd4-1abd5a830a83,Namespace:calico-system,Attempt:0,} returns sandbox id \"cac0d83fcd4b8dc846b59d3b6193849c2ec4f4b2cb97c97c8eaf3f5b694be23d\"" Mar 25 01:34:22.738520 containerd[1528]: time="2025-03-25T01:34:22.733404895Z" level=info msg="CreateContainer within sandbox \"cac0d83fcd4b8dc846b59d3b6193849c2ec4f4b2cb97c97c8eaf3f5b694be23d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 25 01:34:22.748941 containerd[1528]: time="2025-03-25T01:34:22.748898644Z" level=info msg="Container d15200ad0f992da17bce604f261440c2c43fd734d8a32b0ed430d97ea618c515: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:22.765898 containerd[1528]: time="2025-03-25T01:34:22.765850701Z" level=info msg="CreateContainer within sandbox \"cac0d83fcd4b8dc846b59d3b6193849c2ec4f4b2cb97c97c8eaf3f5b694be23d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d15200ad0f992da17bce604f261440c2c43fd734d8a32b0ed430d97ea618c515\"" Mar 25 01:34:22.767857 containerd[1528]: time="2025-03-25T01:34:22.767789383Z" level=info msg="StartContainer for \"d15200ad0f992da17bce604f261440c2c43fd734d8a32b0ed430d97ea618c515\"" Mar 25 01:34:22.770680 containerd[1528]: time="2025-03-25T01:34:22.770643341Z" level=info msg="connecting to shim d15200ad0f992da17bce604f261440c2c43fd734d8a32b0ed430d97ea618c515" address="unix:///run/containerd/s/4424ea4be07df011e2b5faa8fef76860ec9525bc3a023de21a7b0d90da53e58c" protocol=ttrpc version=3 Mar 25 01:34:22.809704 systemd[1]: Started cri-containerd-d15200ad0f992da17bce604f261440c2c43fd734d8a32b0ed430d97ea618c515.scope - libcontainer container d15200ad0f992da17bce604f261440c2c43fd734d8a32b0ed430d97ea618c515. 
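Editor's note: the csi_plugin.go messages above show the kubelet registering the Tigera CSI driver via the socket at /var/lib/kubelet/plugins/csi.tigera.io/csi.sock. A minimal check that the socket is accepting connections (run on the node itself with enough privileges; this only dials, it does not speak the CSI gRPC protocol):

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        // Socket path taken from the registration messages above.
        const sock = "/var/lib/kubelet/plugins/csi.tigera.io/csi.sock"

        conn, err := net.DialTimeout("unix", sock, 2*time.Second)
        if err != nil {
            fmt.Println("plugin socket not reachable:", err)
            return
        }
        defer conn.Close()
        fmt.Println("plugin socket is accepting connections")
    }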
Mar 25 01:34:22.888431 kubelet[2918]: I0325 01:34:22.888350 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-zwppf" podStartSLOduration=30.851571371 podStartE2EDuration="40.888318764s" podCreationTimestamp="2025-03-25 01:33:42 +0000 UTC" firstStartedPulling="2025-03-25 01:34:11.538228748 +0000 UTC m=+51.690838898" lastFinishedPulling="2025-03-25 01:34:21.574976129 +0000 UTC m=+61.727586291" observedRunningTime="2025-03-25 01:34:22.88768238 +0000 UTC m=+63.040292576" watchObservedRunningTime="2025-03-25 01:34:22.888318764 +0000 UTC m=+63.040928920" Mar 25 01:34:22.992706 containerd[1528]: time="2025-03-25T01:34:22.991074371Z" level=info msg="StartContainer for \"d15200ad0f992da17bce604f261440c2c43fd734d8a32b0ed430d97ea618c515\" returns successfully" Mar 25 01:34:23.966070 kubelet[2918]: I0325 01:34:23.965980 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-b85658cf8-5ptbz" podStartSLOduration=5.965952049 podStartE2EDuration="5.965952049s" podCreationTimestamp="2025-03-25 01:34:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:34:23.930497159 +0000 UTC m=+64.083107326" watchObservedRunningTime="2025-03-25 01:34:23.965952049 +0000 UTC m=+64.118562219" Mar 25 01:34:24.114266 kubelet[2918]: I0325 01:34:24.108840 2918 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6685e276-da04-413c-850f-cc7ff0494084" path="/var/lib/kubelet/pods/6685e276-da04-413c-850f-cc7ff0494084/volumes" Mar 25 01:34:24.160759 systemd[1]: Started sshd@16-10.243.75.178:22-218.92.0.225:45354.service - OpenSSH per-connection server daemon (218.92.0.225:45354). Mar 25 01:34:24.788005 systemd[1]: cri-containerd-1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347.scope: Deactivated successfully. Mar 25 01:34:24.788544 systemd[1]: cri-containerd-1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347.scope: Consumed 1.200s CPU time, 262.2M memory peak, 293.2M read from disk. Mar 25 01:34:24.792350 containerd[1528]: time="2025-03-25T01:34:24.792298581Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347\" id:\"1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347\" pid:5264 exited_at:{seconds:1742866464 nanos:791695597}" Mar 25 01:34:24.794058 containerd[1528]: time="2025-03-25T01:34:24.792394922Z" level=info msg="received exit event container_id:\"1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347\" id:\"1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347\" pid:5264 exited_at:{seconds:1742866464 nanos:791695597}" Mar 25 01:34:24.833534 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1dae559a703f38a3457767ccfb99ff36a50e1920a3004bd586e8390837bc2347-rootfs.mount: Deactivated successfully. 
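Editor's note: the pod_startup_latency_tracker entry above for csi-node-driver-zwppf can be roughly reproduced from its own timestamps: the E2E duration is observedRunningTime minus podCreationTimestamp, and the SLO duration additionally excludes the image-pull window. A sketch using the logged values; the results differ from the log by well under a millisecond because the creation timestamp is printed at whole-second precision and the kubelet uses monotonic clock readings:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }

        created := parse("2025-03-25 01:33:42 +0000 UTC")
        running := parse("2025-03-25 01:34:22.88768238 +0000 UTC")
        pullStart := parse("2025-03-25 01:34:11.538228748 +0000 UTC")
        pullEnd := parse("2025-03-25 01:34:21.574976129 +0000 UTC")

        e2e := running.Sub(created)         // ~40.888s, the podStartE2EDuration
        slo := e2e - pullEnd.Sub(pullStart) // ~30.851s, the podStartSLOduration
        fmt.Println(e2e, slo)
    }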
Mar 25 01:34:24.946471 containerd[1528]: time="2025-03-25T01:34:24.945392711Z" level=info msg="CreateContainer within sandbox \"0de677804933a3a0a60b53937d9d7d88ce9ae0d926f217d339f3052db8f2b6ae\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 01:34:24.990850 containerd[1528]: time="2025-03-25T01:34:24.990805772Z" level=info msg="Container 524131ead2e133112dc0b2fec7d7a82d759a7354cda154ef7f877374f1a142cd: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:25.001936 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2952347741.mount: Deactivated successfully. Mar 25 01:34:25.016468 containerd[1528]: time="2025-03-25T01:34:25.016409290Z" level=info msg="CreateContainer within sandbox \"0de677804933a3a0a60b53937d9d7d88ce9ae0d926f217d339f3052db8f2b6ae\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"524131ead2e133112dc0b2fec7d7a82d759a7354cda154ef7f877374f1a142cd\"" Mar 25 01:34:25.019722 containerd[1528]: time="2025-03-25T01:34:25.019675111Z" level=info msg="StartContainer for \"524131ead2e133112dc0b2fec7d7a82d759a7354cda154ef7f877374f1a142cd\"" Mar 25 01:34:25.024222 containerd[1528]: time="2025-03-25T01:34:25.024162080Z" level=info msg="connecting to shim 524131ead2e133112dc0b2fec7d7a82d759a7354cda154ef7f877374f1a142cd" address="unix:///run/containerd/s/d9bc380a031926ff47781854a2c9979f34eb3dbc96c94dd3cd55118f7b1e4e7f" protocol=ttrpc version=3 Mar 25 01:34:25.055715 systemd[1]: Started cri-containerd-524131ead2e133112dc0b2fec7d7a82d759a7354cda154ef7f877374f1a142cd.scope - libcontainer container 524131ead2e133112dc0b2fec7d7a82d759a7354cda154ef7f877374f1a142cd. Mar 25 01:34:25.154181 containerd[1528]: time="2025-03-25T01:34:25.153973025Z" level=info msg="StartContainer for \"524131ead2e133112dc0b2fec7d7a82d759a7354cda154ef7f877374f1a142cd\" returns successfully" Mar 25 01:34:25.462477 kubelet[2918]: I0325 01:34:25.460714 2918 topology_manager.go:215] "Topology Admit Handler" podUID="6fa78ed4-94da-4218-a4fe-64be664b2054" podNamespace="calico-system" podName="calico-kube-controllers-76f4756b5c-lc2zl" Mar 25 01:34:25.479078 systemd[1]: Created slice kubepods-besteffort-pod6fa78ed4_94da_4218_a4fe_64be664b2054.slice - libcontainer container kubepods-besteffort-pod6fa78ed4_94da_4218_a4fe_64be664b2054.slice. 
Mar 25 01:34:25.565004 kubelet[2918]: I0325 01:34:25.564935 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6fa78ed4-94da-4218-a4fe-64be664b2054-tigera-ca-bundle\") pod \"calico-kube-controllers-76f4756b5c-lc2zl\" (UID: \"6fa78ed4-94da-4218-a4fe-64be664b2054\") " pod="calico-system/calico-kube-controllers-76f4756b5c-lc2zl" Mar 25 01:34:25.565207 kubelet[2918]: I0325 01:34:25.565026 2918 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knmcz\" (UniqueName: \"kubernetes.io/projected/6fa78ed4-94da-4218-a4fe-64be664b2054-kube-api-access-knmcz\") pod \"calico-kube-controllers-76f4756b5c-lc2zl\" (UID: \"6fa78ed4-94da-4218-a4fe-64be664b2054\") " pod="calico-system/calico-kube-controllers-76f4756b5c-lc2zl" Mar 25 01:34:25.792388 containerd[1528]: time="2025-03-25T01:34:25.791939381Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76f4756b5c-lc2zl,Uid:6fa78ed4-94da-4218-a4fe-64be664b2054,Namespace:calico-system,Attempt:0,}" Mar 25 01:34:25.998010 kubelet[2918]: I0325 01:34:25.996081 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-m2l8g" podStartSLOduration=6.996002977 podStartE2EDuration="6.996002977s" podCreationTimestamp="2025-03-25 01:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:34:25.995260052 +0000 UTC m=+66.147870227" watchObservedRunningTime="2025-03-25 01:34:25.996002977 +0000 UTC m=+66.148613138" Mar 25 01:34:26.087790 sshd-session[5419]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.225 user=root Mar 25 01:34:26.198146 systemd-networkd[1432]: cali58b5f99b4cf: Link UP Mar 25 01:34:26.204278 systemd-networkd[1432]: cali58b5f99b4cf: Gained carrier Mar 25 01:34:26.233510 containerd[1528]: time="2025-03-25T01:34:26.233179024Z" level=info msg="TaskExit event in podsandbox handler container_id:\"524131ead2e133112dc0b2fec7d7a82d759a7354cda154ef7f877374f1a142cd\" id:\"c62ddd78ac47fe290ce7c4c04751477fb60737f3fc8b87d4c0f107c568c309f1\" pid:5468 exit_status:1 exited_at:{seconds:1742866466 nanos:223847467}" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:25.946 [INFO][5444] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0 calico-kube-controllers-76f4756b5c- calico-system 6fa78ed4-94da-4218-a4fe-64be664b2054 1036 0 2025-03-25 01:34:22 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:76f4756b5c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-y0b1r.gb1.brightbox.com calico-kube-controllers-76f4756b5c-lc2zl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali58b5f99b4cf [] []}} ContainerID="5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" Namespace="calico-system" Pod="calico-kube-controllers-76f4756b5c-lc2zl" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:25.947 [INFO][5444] cni-plugin/k8s.go 
77: Extracted identifiers for CmdAddK8s ContainerID="5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" Namespace="calico-system" Pod="calico-kube-controllers-76f4756b5c-lc2zl" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.060 [INFO][5452] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" HandleID="k8s-pod-network.5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.091 [INFO][5452] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" HandleID="k8s-pod-network.5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00025f390), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-y0b1r.gb1.brightbox.com", "pod":"calico-kube-controllers-76f4756b5c-lc2zl", "timestamp":"2025-03-25 01:34:26.060471264 +0000 UTC"}, Hostname:"srv-y0b1r.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.091 [INFO][5452] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.091 [INFO][5452] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.091 [INFO][5452] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-y0b1r.gb1.brightbox.com' Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.096 [INFO][5452] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.116 [INFO][5452] ipam/ipam.go 372: Looking up existing affinities for host host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.131 [INFO][5452] ipam/ipam.go 489: Trying affinity for 192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.138 [INFO][5452] ipam/ipam.go 155: Attempting to load block cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.148 [INFO][5452] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.149 [INFO][5452] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.152 [INFO][5452] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966 Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.165 [INFO][5452] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.176 [INFO][5452] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.91.71/26] block=192.168.91.64/26 handle="k8s-pod-network.5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.177 [INFO][5452] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.91.71/26] handle="k8s-pod-network.5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" host="srv-y0b1r.gb1.brightbox.com" Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.177 [INFO][5452] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
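Editor's note: the ipam/ipam.go sequence above (load block 192.168.91.64/26 for this host, claim one address, record it under a handle) boils down to finding a free address inside an affine block. The sketch below is a loose illustration of that one step only; the real Calico allocator also persists handles, host affinities and the block document in the datastore:

    package main

    import (
        "fmt"
        "net/netip"
    )

    // nextFree walks a CIDR block and returns the first address that is not
    // already allocated.
    func nextFree(block netip.Prefix, used map[netip.Addr]bool) (netip.Addr, bool) {
        for a := block.Addr(); block.Contains(a); a = a.Next() {
            if !used[a] {
                return a, true
            }
        }
        return netip.Addr{}, false
    }

    func main() {
        block := netip.MustParsePrefix("192.168.91.64/26") // block from the log above
        used := map[netip.Addr]bool{}
        // Pretend .64 through .70 are already claimed by earlier workloads on this host.
        for a := block.Addr(); a.Compare(netip.MustParseAddr("192.168.91.71")) < 0; a = a.Next() {
            used[a] = true
        }
        if ip, ok := nextFree(block, used); ok {
            fmt.Println("claimed", ip) // 192.168.91.71, matching the assignment in the log
        }
    }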
Mar 25 01:34:26.241321 containerd[1528]: 2025-03-25 01:34:26.177 [INFO][5452] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.71/26] IPv6=[] ContainerID="5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" HandleID="k8s-pod-network.5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0" Mar 25 01:34:26.245631 containerd[1528]: 2025-03-25 01:34:26.182 [INFO][5444] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" Namespace="calico-system" Pod="calico-kube-controllers-76f4756b5c-lc2zl" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0", GenerateName:"calico-kube-controllers-76f4756b5c-", Namespace:"calico-system", SelfLink:"", UID:"6fa78ed4-94da-4218-a4fe-64be664b2054", ResourceVersion:"1036", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 34, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"76f4756b5c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-76f4756b5c-lc2zl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali58b5f99b4cf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:26.245631 containerd[1528]: 2025-03-25 01:34:26.182 [INFO][5444] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.91.71/32] ContainerID="5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" Namespace="calico-system" Pod="calico-kube-controllers-76f4756b5c-lc2zl" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0" Mar 25 01:34:26.245631 containerd[1528]: 2025-03-25 01:34:26.182 [INFO][5444] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali58b5f99b4cf ContainerID="5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" Namespace="calico-system" Pod="calico-kube-controllers-76f4756b5c-lc2zl" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0" Mar 25 01:34:26.245631 containerd[1528]: 2025-03-25 01:34:26.203 [INFO][5444] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" Namespace="calico-system" Pod="calico-kube-controllers-76f4756b5c-lc2zl" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0" Mar 25 01:34:26.245631 
containerd[1528]: 2025-03-25 01:34:26.205 [INFO][5444] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" Namespace="calico-system" Pod="calico-kube-controllers-76f4756b5c-lc2zl" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0", GenerateName:"calico-kube-controllers-76f4756b5c-", Namespace:"calico-system", SelfLink:"", UID:"6fa78ed4-94da-4218-a4fe-64be664b2054", ResourceVersion:"1036", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 34, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"76f4756b5c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-y0b1r.gb1.brightbox.com", ContainerID:"5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966", Pod:"calico-kube-controllers-76f4756b5c-lc2zl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali58b5f99b4cf", MAC:"12:88:0a:bb:c3:62", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:34:26.245631 containerd[1528]: 2025-03-25 01:34:26.230 [INFO][5444] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" Namespace="calico-system" Pod="calico-kube-controllers-76f4756b5c-lc2zl" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--76f4756b5c--lc2zl-eth0" Mar 25 01:34:26.326260 containerd[1528]: time="2025-03-25T01:34:26.325307835Z" level=info msg="connecting to shim 5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966" address="unix:///run/containerd/s/9919809a93b4c2cec06cf081ed0049544aa0b4cc28aeb0cdcd86caad44d1a43c" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:34:26.379660 systemd[1]: Started cri-containerd-5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966.scope - libcontainer container 5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966. 
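Editor's note: the endpoint written to the datastore above carries the generated workload MAC 12:88:0a:bb:c3:62. A quick, purely illustrative way to confirm it is a locally administered unicast address, which is what you expect for a synthetic veth MAC:

    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // MAC from the "Added Mac, interface name, and active container ID" entry above.
        mac, err := net.ParseMAC("12:88:0a:bb:c3:62")
        if err != nil {
            panic(err)
        }

        local := mac[0]&0x02 != 0   // locally administered bit set
        unicast := mac[0]&0x01 == 0 // multicast bit clear
        fmt.Printf("%s locally-administered=%t unicast=%t\n", mac, local, unicast)
    }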
Mar 25 01:34:26.516885 containerd[1528]: time="2025-03-25T01:34:26.516739327Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76f4756b5c-lc2zl,Uid:6fa78ed4-94da-4218-a4fe-64be664b2054,Namespace:calico-system,Attempt:0,} returns sandbox id \"5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966\"" Mar 25 01:34:26.540135 containerd[1528]: time="2025-03-25T01:34:26.539917185Z" level=info msg="CreateContainer within sandbox \"5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 25 01:34:26.557363 containerd[1528]: time="2025-03-25T01:34:26.557160031Z" level=info msg="Container f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:34:26.565003 containerd[1528]: time="2025-03-25T01:34:26.564812941Z" level=info msg="CreateContainer within sandbox \"5eab7a0f7d3fa9c66401118698331bf2daaf648df8e2dec60b3e291375777966\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c\"" Mar 25 01:34:26.566678 containerd[1528]: time="2025-03-25T01:34:26.565602080Z" level=info msg="StartContainer for \"f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c\"" Mar 25 01:34:26.567195 containerd[1528]: time="2025-03-25T01:34:26.567162689Z" level=info msg="connecting to shim f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c" address="unix:///run/containerd/s/9919809a93b4c2cec06cf081ed0049544aa0b4cc28aeb0cdcd86caad44d1a43c" protocol=ttrpc version=3 Mar 25 01:34:26.595647 systemd[1]: Started cri-containerd-f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c.scope - libcontainer container f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c. 
Mar 25 01:34:26.683067 containerd[1528]: time="2025-03-25T01:34:26.683009305Z" level=info msg="StartContainer for \"f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c\" returns successfully" Mar 25 01:34:27.231839 containerd[1528]: time="2025-03-25T01:34:27.231770091Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c\" id:\"01d5d70d492f48af2752e5627f87dbd5a3cce68fe28abb6c9bc5e25ab28ddb7b\" pid:5682 exit_status:1 exited_at:{seconds:1742866467 nanos:228506965}" Mar 25 01:34:27.558844 systemd-networkd[1432]: cali58b5f99b4cf: Gained IPv6LL Mar 25 01:34:27.622202 containerd[1528]: time="2025-03-25T01:34:27.621931019Z" level=info msg="TaskExit event in podsandbox handler container_id:\"524131ead2e133112dc0b2fec7d7a82d759a7354cda154ef7f877374f1a142cd\" id:\"a7db7d976530fd8c320d31fb4351b7b8aa39c38aee66b6168f1e086957e0bd15\" pid:5681 exit_status:1 exited_at:{seconds:1742866467 nanos:621344926}" Mar 25 01:34:28.126586 containerd[1528]: time="2025-03-25T01:34:28.126209117Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c\" id:\"9288925efeb3db05beb062a60a344457fb6b187afafcce2ef336334eb9746ba6\" pid:5770 exit_status:1 exited_at:{seconds:1742866468 nanos:125556856}" Mar 25 01:34:28.466916 sshd[5369]: PAM: Permission denied for root from 218.92.0.225 Mar 25 01:34:28.826122 sshd-session[5839]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.225 user=root Mar 25 01:34:29.069151 containerd[1528]: time="2025-03-25T01:34:29.069070583Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c\" id:\"16b030f5cfd343ffe261da0d416200d3537503fe21fade4ed3b7fc6b28ebb5f3\" pid:5854 exit_status:1 exited_at:{seconds:1742866469 nanos:68376536}" Mar 25 01:34:30.285079 sshd[5369]: PAM: Permission denied for root from 218.92.0.225 Mar 25 01:34:30.662018 sshd-session[5863]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.225 user=root Mar 25 01:34:32.062051 sshd[5369]: PAM: Permission denied for root from 218.92.0.225 Mar 25 01:34:32.255522 sshd[5369]: Received disconnect from 218.92.0.225 port 45354:11: [preauth] Mar 25 01:34:32.255522 sshd[5369]: Disconnected from authenticating user root 218.92.0.225 port 45354 [preauth] Mar 25 01:34:32.257724 systemd[1]: sshd@16-10.243.75.178:22-218.92.0.225:45354.service: Deactivated successfully. Mar 25 01:34:33.505762 systemd[1]: Started sshd@17-10.243.75.178:22-218.92.0.225:49644.service - OpenSSH per-connection server daemon (218.92.0.225:49644). Mar 25 01:34:35.002703 sshd-session[5879]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.225 user=root Mar 25 01:34:37.292832 sshd[5869]: PAM: Permission denied for root from 218.92.0.225 Mar 25 01:34:37.724523 sshd-session[5883]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.225 user=root Mar 25 01:34:39.525464 systemd[1]: Started sshd@18-10.243.75.178:22-139.178.68.195:58324.service - OpenSSH per-connection server daemon (139.178.68.195:58324). 
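Editor's note: the repeated pam_unix(sshd:auth) failures from 218.92.0.225 above are a routine password-guessing run against root. A small filter for tallying such attempts per source host from journal text; pipe in something like journalctl -u sshd --no-pager (the unit name may differ per distro, and the file name in the comment is hypothetical):

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    func main() {
        // Usage (name is illustrative): journalctl -u sshd --no-pager | go run count_ssh_failures.go
        re := regexp.MustCompile(`pam_unix\(sshd:auth\): authentication failure;.*rhost=(\S+)`)
        counts := map[string]int{}

        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
        for sc.Scan() {
            if m := re.FindStringSubmatch(sc.Text()); m != nil {
                counts[m[1]]++
            }
        }
        for host, n := range counts {
            fmt.Printf("%-18s %d failed attempts\n", host, n)
        }
    }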
Mar 25 01:34:39.762985 sshd[5869]: PAM: Permission denied for root from 218.92.0.225 Mar 25 01:34:40.167084 sshd-session[5899]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.225 user=root Mar 25 01:34:40.472307 sshd[5896]: Accepted publickey for core from 139.178.68.195 port 58324 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:34:40.475720 sshd-session[5896]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:34:40.486028 systemd-logind[1504]: New session 12 of user core. Mar 25 01:34:40.492670 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 25 01:34:41.767817 sshd[5900]: Connection closed by 139.178.68.195 port 58324 Mar 25 01:34:41.769068 sshd-session[5896]: pam_unix(sshd:session): session closed for user core Mar 25 01:34:41.778756 systemd[1]: sshd@18-10.243.75.178:22-139.178.68.195:58324.service: Deactivated successfully. Mar 25 01:34:41.784251 systemd[1]: session-12.scope: Deactivated successfully. Mar 25 01:34:41.786112 systemd-logind[1504]: Session 12 logged out. Waiting for processes to exit. Mar 25 01:34:41.788515 systemd-logind[1504]: Removed session 12. Mar 25 01:34:42.370031 kubelet[2918]: I0325 01:34:42.369804 2918 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:34:42.411609 kubelet[2918]: I0325 01:34:42.409207 2918 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-76f4756b5c-lc2zl" podStartSLOduration=20.40342465 podStartE2EDuration="20.40342465s" podCreationTimestamp="2025-03-25 01:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:34:27.004115201 +0000 UTC m=+67.156725368" watchObservedRunningTime="2025-03-25 01:34:42.40342465 +0000 UTC m=+82.556034808" Mar 25 01:34:42.469809 sshd[5869]: PAM: Permission denied for root from 218.92.0.225 Mar 25 01:34:42.669960 sshd[5869]: Received disconnect from 218.92.0.225 port 49644:11: [preauth] Mar 25 01:34:42.669960 sshd[5869]: Disconnected from authenticating user root 218.92.0.225 port 49644 [preauth] Mar 25 01:34:42.672697 systemd[1]: sshd@17-10.243.75.178:22-218.92.0.225:49644.service: Deactivated successfully. Mar 25 01:34:43.905384 systemd[1]: Started sshd@19-10.243.75.178:22-218.92.0.225:27270.service - OpenSSH per-connection server daemon (218.92.0.225:27270). Mar 25 01:34:45.426807 sshd-session[5920]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.225 user=root Mar 25 01:34:46.926882 systemd[1]: Started sshd@20-10.243.75.178:22-139.178.68.195:54422.service - OpenSSH per-connection server daemon (139.178.68.195:54422). Mar 25 01:34:47.882387 sshd[5918]: PAM: Permission denied for root from 218.92.0.225 Mar 25 01:34:47.905343 sshd[5922]: Accepted publickey for core from 139.178.68.195 port 54422 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:34:47.908670 sshd-session[5922]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:34:47.917978 systemd-logind[1504]: New session 13 of user core. Mar 25 01:34:47.926038 systemd[1]: Started session-13.scope - Session 13 of User core. 
Mar 25 01:34:48.298756 sshd-session[5925]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.225 user=root Mar 25 01:34:48.762764 sshd[5924]: Connection closed by 139.178.68.195 port 54422 Mar 25 01:34:48.763804 sshd-session[5922]: pam_unix(sshd:session): session closed for user core Mar 25 01:34:48.770573 systemd[1]: sshd@20-10.243.75.178:22-139.178.68.195:54422.service: Deactivated successfully. Mar 25 01:34:48.773987 systemd[1]: session-13.scope: Deactivated successfully. Mar 25 01:34:48.775889 systemd-logind[1504]: Session 13 logged out. Waiting for processes to exit. Mar 25 01:34:48.777570 systemd-logind[1504]: Removed session 13. Mar 25 01:34:50.137368 containerd[1528]: time="2025-03-25T01:34:50.137255266Z" level=info msg="TaskExit event in podsandbox handler container_id:\"524131ead2e133112dc0b2fec7d7a82d759a7354cda154ef7f877374f1a142cd\" id:\"b936d55d9eaa5e5130969a2f29d2288fad50003588708564b40d4892353416eb\" pid:5957 exited_at:{seconds:1742866490 nanos:136373628}" Mar 25 01:34:50.499644 sshd[5918]: PAM: Permission denied for root from 218.92.0.225 Mar 25 01:34:50.922005 sshd-session[5970]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=218.92.0.225 user=root Mar 25 01:34:53.068531 sshd[5918]: PAM: Permission denied for root from 218.92.0.225 Mar 25 01:34:53.271920 sshd[5918]: Received disconnect from 218.92.0.225 port 27270:11: [preauth] Mar 25 01:34:53.271920 sshd[5918]: Disconnected from authenticating user root 218.92.0.225 port 27270 [preauth] Mar 25 01:34:53.277734 systemd[1]: sshd@19-10.243.75.178:22-218.92.0.225:27270.service: Deactivated successfully. Mar 25 01:34:53.923394 systemd[1]: Started sshd@21-10.243.75.178:22-139.178.68.195:54430.service - OpenSSH per-connection server daemon (139.178.68.195:54430). Mar 25 01:34:54.864617 sshd[5974]: Accepted publickey for core from 139.178.68.195 port 54430 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:34:54.868291 sshd-session[5974]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:34:54.881633 systemd-logind[1504]: New session 14 of user core. Mar 25 01:34:54.888725 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 25 01:34:55.656311 sshd[5976]: Connection closed by 139.178.68.195 port 54430 Mar 25 01:34:55.658679 sshd-session[5974]: pam_unix(sshd:session): session closed for user core Mar 25 01:34:55.668547 systemd[1]: sshd@21-10.243.75.178:22-139.178.68.195:54430.service: Deactivated successfully. Mar 25 01:34:55.673319 systemd[1]: session-14.scope: Deactivated successfully. Mar 25 01:34:55.676586 systemd-logind[1504]: Session 14 logged out. Waiting for processes to exit. Mar 25 01:34:55.678978 systemd-logind[1504]: Removed session 14. Mar 25 01:34:55.811691 systemd[1]: Started sshd@22-10.243.75.178:22-139.178.68.195:37814.service - OpenSSH per-connection server daemon (139.178.68.195:37814). 
Mar 25 01:34:55.900301 containerd[1528]: time="2025-03-25T01:34:55.900180502Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c\" id:\"4bd8249e4b9d4089b67c71762a811157e1b71f2fef9b2a89d9119646b8a9a198\" pid:6004 exited_at:{seconds:1742866495 nanos:899189656}" Mar 25 01:34:56.787999 sshd[5995]: Accepted publickey for core from 139.178.68.195 port 37814 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:34:56.789066 sshd-session[5995]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:34:56.802382 systemd-logind[1504]: New session 15 of user core. Mar 25 01:34:56.807943 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 25 01:34:57.691235 sshd[6013]: Connection closed by 139.178.68.195 port 37814 Mar 25 01:34:57.695150 sshd-session[5995]: pam_unix(sshd:session): session closed for user core Mar 25 01:34:57.710105 systemd[1]: sshd@22-10.243.75.178:22-139.178.68.195:37814.service: Deactivated successfully. Mar 25 01:34:57.713394 systemd[1]: session-15.scope: Deactivated successfully. Mar 25 01:34:57.715183 systemd-logind[1504]: Session 15 logged out. Waiting for processes to exit. Mar 25 01:34:57.718234 systemd-logind[1504]: Removed session 15. Mar 25 01:34:57.848907 systemd[1]: Started sshd@23-10.243.75.178:22-139.178.68.195:37820.service - OpenSSH per-connection server daemon (139.178.68.195:37820). Mar 25 01:34:58.805079 sshd[6026]: Accepted publickey for core from 139.178.68.195 port 37820 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:34:58.807474 sshd-session[6026]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:34:58.815910 systemd-logind[1504]: New session 16 of user core. Mar 25 01:34:58.821636 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 25 01:34:59.600382 sshd[6031]: Connection closed by 139.178.68.195 port 37820 Mar 25 01:34:59.602857 sshd-session[6026]: pam_unix(sshd:session): session closed for user core Mar 25 01:34:59.609766 systemd-logind[1504]: Session 16 logged out. Waiting for processes to exit. Mar 25 01:34:59.611494 systemd[1]: sshd@23-10.243.75.178:22-139.178.68.195:37820.service: Deactivated successfully. Mar 25 01:34:59.617271 systemd[1]: session-16.scope: Deactivated successfully. Mar 25 01:34:59.620488 systemd-logind[1504]: Removed session 16. Mar 25 01:35:04.759142 systemd[1]: Started sshd@24-10.243.75.178:22-139.178.68.195:37824.service - OpenSSH per-connection server daemon (139.178.68.195:37824). Mar 25 01:35:05.671777 sshd[6043]: Accepted publickey for core from 139.178.68.195 port 37824 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:35:05.674240 sshd-session[6043]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:35:05.682708 systemd-logind[1504]: New session 17 of user core. Mar 25 01:35:05.688749 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 25 01:35:06.408741 sshd[6045]: Connection closed by 139.178.68.195 port 37824 Mar 25 01:35:06.412386 sshd-session[6043]: pam_unix(sshd:session): session closed for user core Mar 25 01:35:06.424172 systemd[1]: sshd@24-10.243.75.178:22-139.178.68.195:37824.service: Deactivated successfully. Mar 25 01:35:06.427656 systemd[1]: session-17.scope: Deactivated successfully. Mar 25 01:35:06.429077 systemd-logind[1504]: Session 17 logged out. Waiting for processes to exit. 
Mar 25 01:35:06.430626 systemd-logind[1504]: Removed session 17. Mar 25 01:35:06.562164 systemd[1]: Started sshd@25-10.243.75.178:22-139.178.68.195:37826.service - OpenSSH per-connection server daemon (139.178.68.195:37826). Mar 25 01:35:07.509089 sshd[6057]: Accepted publickey for core from 139.178.68.195 port 37826 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:35:07.511778 sshd-session[6057]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:35:07.520336 systemd-logind[1504]: New session 18 of user core. Mar 25 01:35:07.526629 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 25 01:35:08.572980 sshd[6061]: Connection closed by 139.178.68.195 port 37826 Mar 25 01:35:08.577522 sshd-session[6057]: pam_unix(sshd:session): session closed for user core Mar 25 01:35:08.586588 systemd[1]: sshd@25-10.243.75.178:22-139.178.68.195:37826.service: Deactivated successfully. Mar 25 01:35:08.591557 systemd[1]: session-18.scope: Deactivated successfully. Mar 25 01:35:08.593300 systemd-logind[1504]: Session 18 logged out. Waiting for processes to exit. Mar 25 01:35:08.595511 systemd-logind[1504]: Removed session 18. Mar 25 01:35:08.733168 systemd[1]: Started sshd@26-10.243.75.178:22-139.178.68.195:37832.service - OpenSSH per-connection server daemon (139.178.68.195:37832). Mar 25 01:35:09.665401 sshd[6078]: Accepted publickey for core from 139.178.68.195 port 37832 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:35:09.667968 sshd-session[6078]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:35:09.682881 systemd-logind[1504]: New session 19 of user core. Mar 25 01:35:09.690675 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 25 01:35:13.366513 sshd[6080]: Connection closed by 139.178.68.195 port 37832 Mar 25 01:35:13.369946 sshd-session[6078]: pam_unix(sshd:session): session closed for user core Mar 25 01:35:13.388077 systemd[1]: sshd@26-10.243.75.178:22-139.178.68.195:37832.service: Deactivated successfully. Mar 25 01:35:13.392794 systemd[1]: session-19.scope: Deactivated successfully. Mar 25 01:35:13.393362 systemd[1]: session-19.scope: Consumed 786ms CPU time, 70.6M memory peak. Mar 25 01:35:13.394910 systemd-logind[1504]: Session 19 logged out. Waiting for processes to exit. Mar 25 01:35:13.398095 systemd-logind[1504]: Removed session 19. Mar 25 01:35:13.526639 systemd[1]: Started sshd@27-10.243.75.178:22-139.178.68.195:37846.service - OpenSSH per-connection server daemon (139.178.68.195:37846). Mar 25 01:35:14.491420 sshd[6099]: Accepted publickey for core from 139.178.68.195 port 37846 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:35:14.494488 sshd-session[6099]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:35:14.502882 systemd-logind[1504]: New session 20 of user core. Mar 25 01:35:14.508661 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 25 01:35:15.639871 sshd[6101]: Connection closed by 139.178.68.195 port 37846 Mar 25 01:35:15.641286 sshd-session[6099]: pam_unix(sshd:session): session closed for user core Mar 25 01:35:15.648607 systemd[1]: sshd@27-10.243.75.178:22-139.178.68.195:37846.service: Deactivated successfully. Mar 25 01:35:15.652895 systemd[1]: session-20.scope: Deactivated successfully. Mar 25 01:35:15.654290 systemd-logind[1504]: Session 20 logged out. Waiting for processes to exit. 
Mar 25 01:35:15.656354 systemd-logind[1504]: Removed session 20.
Mar 25 01:35:15.798983 systemd[1]: Started sshd@28-10.243.75.178:22-139.178.68.195:36064.service - OpenSSH per-connection server daemon (139.178.68.195:36064).
Mar 25 01:35:16.708211 sshd[6111]: Accepted publickey for core from 139.178.68.195 port 36064 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 01:35:16.711068 sshd-session[6111]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:35:16.720844 systemd-logind[1504]: New session 21 of user core.
Mar 25 01:35:16.725755 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 25 01:35:17.433310 sshd[6113]: Connection closed by 139.178.68.195 port 36064
Mar 25 01:35:17.434356 sshd-session[6111]: pam_unix(sshd:session): session closed for user core
Mar 25 01:35:17.438578 systemd[1]: sshd@28-10.243.75.178:22-139.178.68.195:36064.service: Deactivated successfully.
Mar 25 01:35:17.438672 systemd-logind[1504]: Session 21 logged out. Waiting for processes to exit.
Mar 25 01:35:17.441800 systemd[1]: session-21.scope: Deactivated successfully.
Mar 25 01:35:17.444162 systemd-logind[1504]: Removed session 21.
Mar 25 01:35:20.171661 containerd[1528]: time="2025-03-25T01:35:20.146499822Z" level=info msg="TaskExit event in podsandbox handler container_id:\"524131ead2e133112dc0b2fec7d7a82d759a7354cda154ef7f877374f1a142cd\" id:\"9bc16b08ec93fc3b34ae207834ffac5dcdd9c6030683d579dc00df09824a2175\" pid:6138 exited_at:{seconds:1742866520 nanos:126540522}"
Mar 25 01:35:20.223486 containerd[1528]: time="2025-03-25T01:35:20.223297912Z" level=info msg="StopPodSandbox for \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\""
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.566 [WARNING][6165] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0"
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.568 [INFO][6165] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe"
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.568 [INFO][6165] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" iface="eth0" netns=""
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.568 [INFO][6165] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe"
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.568 [INFO][6165] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe"
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.757 [INFO][6173] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0"
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.760 [INFO][6173] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.760 [INFO][6173] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.775 [WARNING][6173] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0"
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.775 [INFO][6173] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0"
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.777 [INFO][6173] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 25 01:35:20.780972 containerd[1528]: 2025-03-25 01:35:20.779 [INFO][6165] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe"
Mar 25 01:35:20.786016 containerd[1528]: time="2025-03-25T01:35:20.785966320Z" level=info msg="TearDown network for sandbox \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\" successfully"
Mar 25 01:35:20.786109 containerd[1528]: time="2025-03-25T01:35:20.786017745Z" level=info msg="StopPodSandbox for \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\" returns successfully"
Mar 25 01:35:20.812168 containerd[1528]: time="2025-03-25T01:35:20.812098745Z" level=info msg="RemovePodSandbox for \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\""
Mar 25 01:35:20.821240 containerd[1528]: time="2025-03-25T01:35:20.821176934Z" level=info msg="Forcibly stopping sandbox \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\""
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.909 [WARNING][6191] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" WorkloadEndpoint="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0"
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.909 [INFO][6191] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe"
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.909 [INFO][6191] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" iface="eth0" netns=""
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.909 [INFO][6191] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe"
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.909 [INFO][6191] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe"
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.944 [INFO][6198] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0"
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.944 [INFO][6198] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.944 [INFO][6198] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.958 [WARNING][6198] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0"
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.958 [INFO][6198] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" HandleID="k8s-pod-network.249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe" Workload="srv--y0b1r.gb1.brightbox.com-k8s-calico--kube--controllers--849f5dfb9c--6lwwx-eth0"
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.960 [INFO][6198] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 25 01:35:20.967350 containerd[1528]: 2025-03-25 01:35:20.964 [INFO][6191] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe"
Mar 25 01:35:20.967350 containerd[1528]: time="2025-03-25T01:35:20.967258778Z" level=info msg="TearDown network for sandbox \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\" successfully"
Mar 25 01:35:21.035355 containerd[1528]: time="2025-03-25T01:35:21.035179268Z" level=info msg="Ensure that sandbox 249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe in task-service has been cleanup successfully"
Mar 25 01:35:21.126648 containerd[1528]: time="2025-03-25T01:35:21.126569109Z" level=info msg="RemovePodSandbox \"249fb1f6abe5495ed438ed1d012d151918547d3f33c75bb9766179bf7bf2bebe\" returns successfully"
Mar 25 01:35:21.127968 containerd[1528]: time="2025-03-25T01:35:21.127765116Z" level=info msg="StopPodSandbox for \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\""
Mar 25 01:35:21.128373 containerd[1528]: time="2025-03-25T01:35:21.128343734Z" level=info msg="TearDown network for sandbox \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\" successfully"
Mar 25 01:35:21.128546 containerd[1528]: time="2025-03-25T01:35:21.128521855Z" level=info msg="StopPodSandbox for \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\" returns successfully"
Mar 25 01:35:21.129705 containerd[1528]: time="2025-03-25T01:35:21.129022632Z" level=info msg="RemovePodSandbox for \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\""
Mar 25 01:35:21.129705 containerd[1528]: time="2025-03-25T01:35:21.129056458Z" level=info msg="Forcibly stopping sandbox \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\""
Mar 25 01:35:21.129705 containerd[1528]: time="2025-03-25T01:35:21.129150807Z" level=info msg="TearDown network for sandbox \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\" successfully"
Mar 25 01:35:21.132124 containerd[1528]: time="2025-03-25T01:35:21.132082842Z" level=info msg="Ensure that sandbox 3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b in task-service has been cleanup successfully"
Mar 25 01:35:21.136841 containerd[1528]: time="2025-03-25T01:35:21.136810846Z" level=info msg="RemovePodSandbox \"3018991409c51c3961b5458edf17b4eef0406b3bbd57a46c887d5332551e654b\" returns successfully"
Mar 25 01:35:21.137305 systemd[1]: sshd@13-10.243.75.178:22-218.92.0.221:39028.service: Deactivated successfully.
Mar 25 01:35:22.597772 systemd[1]: Started sshd@29-10.243.75.178:22-139.178.68.195:36070.service - OpenSSH per-connection server daemon (139.178.68.195:36070).
Mar 25 01:35:23.589284 sshd[6208]: Accepted publickey for core from 139.178.68.195 port 36070 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 01:35:23.592710 sshd-session[6208]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:35:23.603735 systemd-logind[1504]: New session 22 of user core.
Mar 25 01:35:23.609697 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 25 01:35:24.703119 sshd[6210]: Connection closed by 139.178.68.195 port 36070
Mar 25 01:35:24.704297 sshd-session[6208]: pam_unix(sshd:session): session closed for user core
Mar 25 01:35:24.712100 systemd-logind[1504]: Session 22 logged out. Waiting for processes to exit.
Mar 25 01:35:24.714310 systemd[1]: sshd@29-10.243.75.178:22-139.178.68.195:36070.service: Deactivated successfully.
Mar 25 01:35:24.718471 systemd[1]: session-22.scope: Deactivated successfully.
Mar 25 01:35:24.726271 systemd-logind[1504]: Removed session 22.
Mar 25 01:35:26.161100 containerd[1528]: time="2025-03-25T01:35:26.160849443Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c\" id:\"321a6a645fc7fe09749a7543548b6b6a0532265d0baba038fdb8dab39d0c432a\" pid:6241 exited_at:{seconds:1742866526 nanos:112913497}"
Mar 25 01:35:26.161100 containerd[1528]: time="2025-03-25T01:35:26.160994538Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f502560f61d1bbb3c78a58a8e1db6dfc2b55278d2d84a54c7ce5ea469c037d3c\" id:\"253764a2a968d7e76e2192174757ef1f2e2b56e36b6599bec9ced070f179057c\" pid:6249 exited_at:{seconds:1742866526 nanos:121338227}"
Mar 25 01:35:29.868203 systemd[1]: Started sshd@30-10.243.75.178:22-139.178.68.195:33638.service - OpenSSH per-connection server daemon (139.178.68.195:33638).
Mar 25 01:35:30.838267 sshd[6263]: Accepted publickey for core from 139.178.68.195 port 33638 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 01:35:30.840908 sshd-session[6263]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:35:30.851662 systemd-logind[1504]: New session 23 of user core.
Mar 25 01:35:30.857692 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 25 01:35:31.640700 sshd[6265]: Connection closed by 139.178.68.195 port 33638
Mar 25 01:35:31.641831 sshd-session[6263]: pam_unix(sshd:session): session closed for user core
Mar 25 01:35:31.647739 systemd[1]: sshd@30-10.243.75.178:22-139.178.68.195:33638.service: Deactivated successfully.
Mar 25 01:35:31.651027 systemd[1]: session-23.scope: Deactivated successfully.
Mar 25 01:35:31.652969 systemd-logind[1504]: Session 23 logged out. Waiting for processes to exit.
Mar 25 01:35:31.655070 systemd-logind[1504]: Removed session 23.