Mar 25 01:58:58.027230 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Mar 24 23:38:35 -00 2025
Mar 25 01:58:58.027274 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 01:58:58.027289 kernel: BIOS-provided physical RAM map:
Mar 25 01:58:58.027299 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 25 01:58:58.027313 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 25 01:58:58.027323 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 25 01:58:58.027334 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Mar 25 01:58:58.027344 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Mar 25 01:58:58.027354 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Mar 25 01:58:58.027364 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Mar 25 01:58:58.027374 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 25 01:58:58.027384 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 25 01:58:58.027407 kernel: NX (Execute Disable) protection: active
Mar 25 01:58:58.027436 kernel: APIC: Static calls initialized
Mar 25 01:58:58.027449 kernel: SMBIOS 2.8 present.
Mar 25 01:58:58.027466 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Mar 25 01:58:58.027476 kernel: Hypervisor detected: KVM
Mar 25 01:58:58.027487 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 25 01:58:58.027503 kernel: kvm-clock: using sched offset of 4593240724 cycles
Mar 25 01:58:58.027553 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 25 01:58:58.027578 kernel: tsc: Detected 2799.998 MHz processor
Mar 25 01:58:58.027590 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 25 01:58:58.027601 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 25 01:58:58.027612 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Mar 25 01:58:58.027623 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 25 01:58:58.027634 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 25 01:58:58.027645 kernel: Using GB pages for direct mapping
Mar 25 01:58:58.027662 kernel: ACPI: Early table checksum verification disabled
Mar 25 01:58:58.027673 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Mar 25 01:58:58.027685 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:58:58.027696 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:58:58.027707 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:58:58.027718 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Mar 25 01:58:58.027729 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:58:58.027740 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:58:58.027751 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:58:58.027767 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:58:58.027778 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Mar 25 01:58:58.027789 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Mar 25 01:58:58.027800 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Mar 25 01:58:58.027817 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Mar 25 01:58:58.027828 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Mar 25 01:58:58.027844 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Mar 25 01:58:58.027855 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Mar 25 01:58:58.027867 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Mar 25 01:58:58.027878 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Mar 25 01:58:58.029800 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Mar 25 01:58:58.029817 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Mar 25 01:58:58.029829 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Mar 25 01:58:58.029840 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Mar 25 01:58:58.029859 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Mar 25 01:58:58.029870 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Mar 25 01:58:58.029882 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Mar 25 01:58:58.029936 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Mar 25 01:58:58.029948 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Mar 25 01:58:58.029960 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Mar 25 01:58:58.029971 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Mar 25 01:58:58.029982 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Mar 25 01:58:58.029994 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Mar 25 01:58:58.030005 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Mar 25 01:58:58.030023 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Mar 25 01:58:58.030034 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Mar 25 01:58:58.030046 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Mar 25 01:58:58.030058 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Mar 25 01:58:58.030069 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Mar 25 01:58:58.030097 kernel: Zone ranges:
Mar 25 01:58:58.030108 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 25 01:58:58.030119 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Mar 25 01:58:58.030130 kernel: Normal empty
Mar 25 01:58:58.030145 kernel: Movable zone start for each node
Mar 25 01:58:58.030157 kernel: Early memory node ranges
Mar 25 01:58:58.030168 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 25 01:58:58.030179 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Mar 25 01:58:58.030190 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Mar 25 01:58:58.030219 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 25 01:58:58.030635 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 25 01:58:58.030652 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Mar 25 01:58:58.030664 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 25 01:58:58.030681 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 25 01:58:58.030693 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 25 01:58:58.030705 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 25 01:58:58.030717 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 25 01:58:58.030728 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 25 01:58:58.030740 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 25 01:58:58.030752 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 25 01:58:58.030763 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 25 01:58:58.030774 kernel: TSC deadline timer available
Mar 25 01:58:58.030786 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Mar 25 01:58:58.030802 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 25 01:58:58.030814 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Mar 25 01:58:58.030825 kernel: Booting paravirtualized kernel on KVM
Mar 25 01:58:58.030837 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 25 01:58:58.030849 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Mar 25 01:58:58.030860 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Mar 25 01:58:58.030872 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Mar 25 01:58:58.030883 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Mar 25 01:58:58.030937 kernel: kvm-guest: PV spinlocks enabled
Mar 25 01:58:58.030950 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 25 01:58:58.030963 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 01:58:58.030975 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 25 01:58:58.030987 kernel: random: crng init done
Mar 25 01:58:58.030998 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 25 01:58:58.031010 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 25 01:58:58.031021 kernel: Fallback order for Node 0: 0
Mar 25 01:58:58.031037 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Mar 25 01:58:58.031049 kernel: Policy zone: DMA32
Mar 25 01:58:58.031061 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 25 01:58:58.031072 kernel: software IO TLB: area num 16.
Mar 25 01:58:58.031084 kernel: Memory: 1897432K/2096616K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43592K init, 1472K bss, 198924K reserved, 0K cma-reserved)
Mar 25 01:58:58.031096 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Mar 25 01:58:58.031119 kernel: Kernel/User page tables isolation: enabled
Mar 25 01:58:58.031130 kernel: ftrace: allocating 37985 entries in 149 pages
Mar 25 01:58:58.031141 kernel: ftrace: allocated 149 pages with 4 groups
Mar 25 01:58:58.031156 kernel: Dynamic Preempt: voluntary
Mar 25 01:58:58.031167 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 25 01:58:58.031179 kernel: rcu: RCU event tracing is enabled.
Mar 25 01:58:58.031191 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Mar 25 01:58:58.031203 kernel: Trampoline variant of Tasks RCU enabled.
Mar 25 01:58:58.031224 kernel: Rude variant of Tasks RCU enabled.
Mar 25 01:58:58.031240 kernel: Tracing variant of Tasks RCU enabled.
Mar 25 01:58:58.031252 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 25 01:58:58.031263 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Mar 25 01:58:58.031275 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Mar 25 01:58:58.031286 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 25 01:58:58.031298 kernel: Console: colour VGA+ 80x25
Mar 25 01:58:58.031329 kernel: printk: console [tty0] enabled
Mar 25 01:58:58.031341 kernel: printk: console [ttyS0] enabled
Mar 25 01:58:58.031353 kernel: ACPI: Core revision 20230628
Mar 25 01:58:58.031365 kernel: APIC: Switch to symmetric I/O mode setup
Mar 25 01:58:58.031377 kernel: x2apic enabled
Mar 25 01:58:58.031393 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 25 01:58:58.031406 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns
Mar 25 01:58:58.031418 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Mar 25 01:58:58.031430 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 25 01:58:58.031442 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Mar 25 01:58:58.031454 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Mar 25 01:58:58.031467 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 25 01:58:58.031479 kernel: Spectre V2 : Mitigation: Retpolines
Mar 25 01:58:58.031490 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 25 01:58:58.031502 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Mar 25 01:58:58.031518 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Mar 25 01:58:58.031530 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 25 01:58:58.031542 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 25 01:58:58.031554 kernel: MDS: Mitigation: Clear CPU buffers
Mar 25 01:58:58.031578 kernel: MMIO Stale Data: Unknown: No mitigations
Mar 25 01:58:58.031590 kernel: SRBDS: Unknown: Dependent on hypervisor status
Mar 25 01:58:58.031602 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 25 01:58:58.031614 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 25 01:58:58.031626 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 25 01:58:58.031638 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 25 01:58:58.031650 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Mar 25 01:58:58.031667 kernel: Freeing SMP alternatives memory: 32K
Mar 25 01:58:58.031679 kernel: pid_max: default: 32768 minimum: 301
Mar 25 01:58:58.031690 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 25 01:58:58.031702 kernel: landlock: Up and running.
Mar 25 01:58:58.031714 kernel: SELinux: Initializing.
Mar 25 01:58:58.031726 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 25 01:58:58.031738 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 25 01:58:58.031750 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Mar 25 01:58:58.031762 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 25 01:58:58.031774 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 25 01:58:58.031786 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Mar 25 01:58:58.031803 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Mar 25 01:58:58.031815 kernel: signal: max sigframe size: 1776
Mar 25 01:58:58.031827 kernel: rcu: Hierarchical SRCU implementation.
Mar 25 01:58:58.031840 kernel: rcu: Max phase no-delay instances is 400.
Mar 25 01:58:58.031852 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 25 01:58:58.031864 kernel: smp: Bringing up secondary CPUs ...
Mar 25 01:58:58.031876 kernel: smpboot: x86: Booting SMP configuration:
Mar 25 01:58:58.031899 kernel: .... node #0, CPUs: #1
Mar 25 01:58:58.031914 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Mar 25 01:58:58.031931 kernel: smp: Brought up 1 node, 2 CPUs
Mar 25 01:58:58.031944 kernel: smpboot: Max logical packages: 16
Mar 25 01:58:58.031956 kernel: smpboot: Total of 2 processors activated (11199.99 BogoMIPS)
Mar 25 01:58:58.031968 kernel: devtmpfs: initialized
Mar 25 01:58:58.031980 kernel: x86/mm: Memory block size: 128MB
Mar 25 01:58:58.031992 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 25 01:58:58.032004 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Mar 25 01:58:58.032016 kernel: pinctrl core: initialized pinctrl subsystem
Mar 25 01:58:58.032028 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 25 01:58:58.032045 kernel: audit: initializing netlink subsys (disabled)
Mar 25 01:58:58.032057 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 25 01:58:58.032069 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 25 01:58:58.032081 kernel: audit: type=2000 audit(1742867937.009:1): state=initialized audit_enabled=0 res=1
Mar 25 01:58:58.032093 kernel: cpuidle: using governor menu
Mar 25 01:58:58.032105 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 25 01:58:58.032117 kernel: dca service started, version 1.12.1
Mar 25 01:58:58.032129 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Mar 25 01:58:58.032141 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Mar 25 01:58:58.032158 kernel: PCI: Using configuration type 1 for base access
Mar 25 01:58:58.032170 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 25 01:58:58.032182 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 25 01:58:58.032194 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 25 01:58:58.032206 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 25 01:58:58.032218 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 25 01:58:58.032230 kernel: ACPI: Added _OSI(Module Device)
Mar 25 01:58:58.032242 kernel: ACPI: Added _OSI(Processor Device)
Mar 25 01:58:58.032254 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 25 01:58:58.032270 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 25 01:58:58.032282 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 25 01:58:58.032294 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 25 01:58:58.032306 kernel: ACPI: Interpreter enabled
Mar 25 01:58:58.032319 kernel: ACPI: PM: (supports S0 S5)
Mar 25 01:58:58.032331 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 25 01:58:58.032343 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 25 01:58:58.032355 kernel: PCI: Using E820 reservations for host bridge windows
Mar 25 01:58:58.032367 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 25 01:58:58.032388 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 25 01:58:58.032708 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 25 01:58:58.032881 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 25 01:58:58.034899 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 25 01:58:58.034932 kernel: PCI host bridge to bus 0000:00
Mar 25 01:58:58.035133 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 25 01:58:58.035325 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 25 01:58:58.035501 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 25 01:58:58.035697 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Mar 25 01:58:58.035861 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 25 01:58:58.039880 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Mar 25 01:58:58.040067 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 25 01:58:58.040258 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 25 01:58:58.040466 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Mar 25 01:58:58.040647 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Mar 25 01:58:58.040812 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Mar 25 01:58:58.042069 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Mar 25 01:58:58.042248 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 25 01:58:58.042460 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Mar 25 01:58:58.042664 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Mar 25 01:58:58.042869 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Mar 25 01:58:58.043076 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Mar 25 01:58:58.043253 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Mar 25 01:58:58.043436 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Mar 25 01:58:58.043639 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Mar 25 01:58:58.043808 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Mar 25 01:58:58.044091 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Mar 25 01:58:58.044256 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Mar 25 01:58:58.044427 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Mar 25 01:58:58.044631 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Mar 25 01:58:58.044803 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Mar 25 01:58:58.047058 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Mar 25 01:58:58.047267 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Mar 25 01:58:58.047441 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Mar 25 01:58:58.047632 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Mar 25 01:58:58.047799 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Mar 25 01:58:58.048013 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Mar 25 01:58:58.048195 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Mar 25 01:58:58.048367 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Mar 25 01:58:58.048615 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Mar 25 01:58:58.048780 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Mar 25 01:58:58.049777 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Mar 25 01:58:58.049993 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Mar 25 01:58:58.050183 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 25 01:58:58.050371 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 25 01:58:58.050554 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 25 01:58:58.050738 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Mar 25 01:58:58.051968 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Mar 25 01:58:58.052192 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 25 01:58:58.052365 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Mar 25 01:58:58.052557 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Mar 25 01:58:58.052773 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Mar 25 01:58:58.053991 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 25 01:58:58.054187 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 25 01:58:58.054377 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 25 01:58:58.054599 kernel: pci_bus 0000:02: extended config space not accessible
Mar 25 01:58:58.054788 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Mar 25 01:58:58.055088 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Mar 25 01:58:58.055309 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 25 01:58:58.055501 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 25 01:58:58.055699 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Mar 25 01:58:58.055878 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Mar 25 01:58:58.056086 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 25 01:58:58.056264 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 25 01:58:58.056449 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 25 01:58:58.056673 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Mar 25 01:58:58.056845 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Mar 25 01:58:58.057411 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 25 01:58:58.057615 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 25 01:58:58.057781 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 25 01:58:58.060088 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 25 01:58:58.060276 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 25 01:58:58.060453 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 25 01:58:58.060671 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 25 01:58:58.060838 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 25 01:58:58.062119 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 25 01:58:58.062332 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 25 01:58:58.062512 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 25 01:58:58.062711 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 25 01:58:58.062886 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 25 01:58:58.063090 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 25 01:58:58.063262 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 25 01:58:58.063440 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 25 01:58:58.063653 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 25 01:58:58.063819 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 25 01:58:58.063838 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 25 01:58:58.063851 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 25 01:58:58.063864 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 25 01:58:58.063876 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 25 01:58:58.065939 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 25 01:58:58.065963 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 25 01:58:58.065976 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 25 01:58:58.065988 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 25 01:58:58.066001 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 25 01:58:58.066019 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 25 01:58:58.066031 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 25 01:58:58.066043 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 25 01:58:58.066056 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 25 01:58:58.066076 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 25 01:58:58.066089 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 25 01:58:58.066101 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 25 01:58:58.066113 kernel: iommu: Default domain type: Translated
Mar 25 01:58:58.066126 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 25 01:58:58.066138 kernel: PCI: Using ACPI for IRQ routing
Mar 25 01:58:58.066150 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 25 01:58:58.066163 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 25 01:58:58.066175 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Mar 25 01:58:58.066367 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 25 01:58:58.066597 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 25 01:58:58.066767 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 25 01:58:58.066787 kernel: vgaarb: loaded
Mar 25 01:58:58.066799 kernel: clocksource: Switched to clocksource kvm-clock
Mar 25 01:58:58.066812 kernel: VFS: Disk quotas dquot_6.6.0
Mar 25 01:58:58.066824 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 25 01:58:58.066837 kernel: pnp: PnP ACPI init
Mar 25 01:58:58.068990 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Mar 25 01:58:58.069022 kernel: pnp: PnP ACPI: found 5 devices
Mar 25 01:58:58.069035 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 25 01:58:58.069048 kernel: NET: Registered PF_INET protocol family
Mar 25 01:58:58.069060 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 25 01:58:58.069079 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Mar 25 01:58:58.069091 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 25 01:58:58.069103 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 25 01:58:58.069124 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Mar 25 01:58:58.069136 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Mar 25 01:58:58.069149 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 25 01:58:58.069161 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 25 01:58:58.069174 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 25 01:58:58.069186 kernel: NET: Registered PF_XDP protocol family
Mar 25 01:58:58.069354 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Mar 25 01:58:58.069530 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Mar 25 01:58:58.069748 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Mar 25 01:58:58.069969 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Mar 25 01:58:58.070272 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 25 01:58:58.070509 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 25 01:58:58.070809 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 25 01:58:58.071070 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 25 01:58:58.071247 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Mar 25 01:58:58.071433 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Mar 25 01:58:58.071610 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Mar 25 01:58:58.071774 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Mar 25 01:58:58.071966 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Mar 25 01:58:58.072141 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Mar 25 01:58:58.072311 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Mar 25 01:58:58.072492 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Mar 25 01:58:58.072727 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Mar 25 01:58:58.072931 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 25 01:58:58.073107 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Mar 25 01:58:58.073291 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Mar 25 01:58:58.073464 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Mar 25 01:58:58.073651 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 25 01:58:58.073814 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Mar 25 01:58:58.074008 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Mar 25 01:58:58.074179 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 25 01:58:58.074371 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 25 01:58:58.074570 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Mar 25 01:58:58.074748 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Mar 25 01:58:58.074983 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 25 01:58:58.075166 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 25 01:58:58.075350 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Mar 25 01:58:58.075523 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Mar 25 01:58:58.075710 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 25 01:58:58.075875 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 25 01:58:58.076068 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Mar 25 01:58:58.076246 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Mar 25 01:58:58.076410 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 25 01:58:58.076606 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 25 01:58:58.076773 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Mar 25 01:58:58.077024 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Mar 25 01:58:58.077198 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 25 01:58:58.077385 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 25 01:58:58.077579 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Mar 25 01:58:58.077752 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Mar 25 01:58:58.077952 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Mar 25 01:58:58.078124 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 25 01:58:58.078286 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Mar 25 01:58:58.078448 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Mar 25 01:58:58.078636 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 25 01:58:58.078801 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 25 01:58:58.079002 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 25 01:58:58.079154 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 25 01:58:58.079302 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 25 01:58:58.079457 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Mar 25 01:58:58.079619 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Mar 25 01:58:58.079768 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Mar 25 01:58:58.079975 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Mar 25 01:58:58.080136 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Mar 25 01:58:58.080292 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 25 01:58:58.080465 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Mar 25 01:58:58.080680 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Mar 25 01:58:58.080840 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Mar 25 01:58:58.081048 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 25 01:58:58.081223 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Mar 25 01:58:58.081388 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Mar 25 01:58:58.081532 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 25 01:58:58.081722 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Mar 25 01:58:58.081884 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Mar 25 01:58:58.082095 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 25 01:58:58.082270 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Mar 25 01:58:58.082436 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Mar 25 01:58:58.082611 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 25 01:58:58.082781 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Mar 25 01:58:58.083005 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Mar 25 01:58:58.083161 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 25 01:58:58.083323 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Mar 25 01:58:58.083478 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Mar 25 01:58:58.083648 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 25 01:58:58.083822 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Mar 25 01:58:58.084010 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Mar 25 01:58:58.084174 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 25 01:58:58.084201 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Mar 25 01:58:58.084214 kernel: PCI: CLS 0 bytes, default 64
Mar 25 01:58:58.084227 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar
25 01:58:58.084249 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB) Mar 25 01:58:58.084261 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Mar 25 01:58:58.084274 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns Mar 25 01:58:58.084287 kernel: Initialise system trusted keyrings Mar 25 01:58:58.084300 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Mar 25 01:58:58.084319 kernel: Key type asymmetric registered Mar 25 01:58:58.084332 kernel: Asymmetric key parser 'x509' registered Mar 25 01:58:58.084345 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Mar 25 01:58:58.084358 kernel: io scheduler mq-deadline registered Mar 25 01:58:58.084371 kernel: io scheduler kyber registered Mar 25 01:58:58.084384 kernel: io scheduler bfq registered Mar 25 01:58:58.084580 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Mar 25 01:58:58.084757 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Mar 25 01:58:58.084984 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:58:58.085168 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Mar 25 01:58:58.085330 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Mar 25 01:58:58.085493 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:58:58.085692 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Mar 25 01:58:58.085854 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Mar 25 01:58:58.086071 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:58:58.086255 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Mar 25 
01:58:58.086421 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Mar 25 01:58:58.086625 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:58:58.086798 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Mar 25 01:58:58.087011 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Mar 25 01:58:58.087194 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:58:58.087363 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Mar 25 01:58:58.087538 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Mar 25 01:58:58.087729 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:58:58.087942 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Mar 25 01:58:58.088107 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Mar 25 01:58:58.088288 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:58:58.088460 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Mar 25 01:58:58.088638 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Mar 25 01:58:58.088802 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:58:58.088822 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 25 01:58:58.088849 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 25 01:58:58.088861 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Mar 25 01:58:58.088881 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 25 01:58:58.088893 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 25 01:58:58.088930 kernel: 
i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 25 01:58:58.088943 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 25 01:58:58.088955 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 25 01:58:58.089136 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 25 01:58:58.089155 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 25 01:58:58.089321 kernel: rtc_cmos 00:03: registered as rtc0 Mar 25 01:58:58.089482 kernel: rtc_cmos 00:03: setting system clock to 2025-03-25T01:58:57 UTC (1742867937) Mar 25 01:58:58.089652 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram Mar 25 01:58:58.089672 kernel: intel_pstate: CPU model not supported Mar 25 01:58:58.089685 kernel: NET: Registered PF_INET6 protocol family Mar 25 01:58:58.089698 kernel: Segment Routing with IPv6 Mar 25 01:58:58.089710 kernel: In-situ OAM (IOAM) with IPv6 Mar 25 01:58:58.089723 kernel: NET: Registered PF_PACKET protocol family Mar 25 01:58:58.089735 kernel: Key type dns_resolver registered Mar 25 01:58:58.089748 kernel: IPI shorthand broadcast: enabled Mar 25 01:58:58.089767 kernel: sched_clock: Marking stable (1074011873, 221042506)->(1509872495, -214818116) Mar 25 01:58:58.089780 kernel: registered taskstats version 1 Mar 25 01:58:58.089793 kernel: Loading compiled-in X.509 certificates Mar 25 01:58:58.089806 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: eff01054e94a599f8e404b9a9482f4e2220f5386' Mar 25 01:58:58.089819 kernel: Key type .fscrypt registered Mar 25 01:58:58.089831 kernel: Key type fscrypt-provisioning registered Mar 25 01:58:58.089844 kernel: ima: No TPM chip found, activating TPM-bypass! 
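The rtc_cmos entry above pairs an ISO timestamp with its Unix epoch value (1742867937). A minimal sketch verifying that conversion with Python's stdlib; the variable names are mine, the values are taken from this log:

```python
from datetime import datetime, timezone

# Epoch value reported by rtc_cmos when it set the system clock.
RTC_EPOCH = 1742867937

# Convert to an aware UTC datetime; it should match the logged wall-clock time.
stamp = datetime.fromtimestamp(RTC_EPOCH, tz=timezone.utc)
print(stamp.isoformat())  # 2025-03-25T01:58:57+00:00
```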
Mar 25 01:58:58.089857 kernel: ima: Allocated hash algorithm: sha1 Mar 25 01:58:58.089873 kernel: ima: No architecture policies found Mar 25 01:58:58.089908 kernel: clk: Disabling unused clocks Mar 25 01:58:58.089925 kernel: Freeing unused kernel image (initmem) memory: 43592K Mar 25 01:58:58.089938 kernel: Write protecting the kernel read-only data: 40960k Mar 25 01:58:58.089950 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K Mar 25 01:58:58.089963 kernel: Run /init as init process Mar 25 01:58:58.089989 kernel: with arguments: Mar 25 01:58:58.090001 kernel: /init Mar 25 01:58:58.090013 kernel: with environment: Mar 25 01:58:58.090024 kernel: HOME=/ Mar 25 01:58:58.090055 kernel: TERM=linux Mar 25 01:58:58.090067 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 25 01:58:58.090087 systemd[1]: Successfully made /usr/ read-only. Mar 25 01:58:58.090114 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:58:58.090127 systemd[1]: Detected virtualization kvm. Mar 25 01:58:58.090139 systemd[1]: Detected architecture x86-64. Mar 25 01:58:58.090151 systemd[1]: Running in initrd. Mar 25 01:58:58.090168 systemd[1]: No hostname configured, using default hostname. Mar 25 01:58:58.090181 systemd[1]: Hostname set to . Mar 25 01:58:58.090197 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:58:58.090221 systemd[1]: Queued start job for default target initrd.target. Mar 25 01:58:58.090233 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:58:58.090246 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
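The long `+PAM +AUDIT ... -XKBCOMMON` string in the systemd banner above encodes compile-time options: `+` for built in, `-` for omitted. A hedged sketch (the parser name is my own) that splits such a banner into enabled and disabled sets:

```python
def parse_features(banner: str) -> tuple[set, set]:
    """Split a systemd feature banner into (enabled, disabled) option names."""
    enabled, disabled = set(), set()
    for token in banner.split():
        if token.startswith("+"):
            enabled.add(token[1:])
        elif token.startswith("-"):
            disabled.add(token[1:])
    return enabled, disabled

# A shortened excerpt of the banner from this boot.
banner = "+PAM +AUDIT +SELINUX -APPARMOR +IMA -GNUTLS +OPENSSL -ACL"
on, off = parse_features(banner)
print(sorted(off))  # ['ACL', 'APPARMOR', 'GNUTLS']
```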
Mar 25 01:58:58.090259 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 25 01:58:58.090285 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:58:58.090302 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 25 01:58:58.090317 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 25 01:58:58.090343 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 25 01:58:58.090357 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 25 01:58:58.090371 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:58:58.090384 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:58:58.090398 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:58:58.090416 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:58:58.090430 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:58:58.090444 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:58:58.090457 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:58:58.090471 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:58:58.090484 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 25 01:58:58.090497 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 25 01:58:58.090511 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:58:58.090529 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:58:58.090543 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Mar 25 01:58:58.090556 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:58:58.090582 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 25 01:58:58.090596 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:58:58.090609 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 25 01:58:58.090622 systemd[1]: Starting systemd-fsck-usr.service... Mar 25 01:58:58.090636 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:58:58.090649 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:58:58.090668 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:58:58.090682 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 25 01:58:58.090740 systemd-journald[201]: Collecting audit messages is disabled. Mar 25 01:58:58.090773 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:58:58.090794 systemd[1]: Finished systemd-fsck-usr.service. Mar 25 01:58:58.090808 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 25 01:58:58.090822 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:58:58.090842 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 25 01:58:58.090860 kernel: Bridge firewalling registered Mar 25 01:58:58.090873 systemd-journald[201]: Journal started Mar 25 01:58:58.090944 systemd-journald[201]: Runtime Journal (/run/log/journal/ca6dc9962b79423398663e97d8d7c551) is 4.7M, max 37.9M, 33.2M free. Mar 25 01:58:58.025853 systemd-modules-load[203]: Inserted module 'overlay' Mar 25 01:58:58.138333 systemd[1]: Started systemd-journald.service - Journal Service. 
Mar 25 01:58:58.077039 systemd-modules-load[203]: Inserted module 'br_netfilter' Mar 25 01:58:58.139332 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:58:58.140499 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:58:58.149379 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:58:58.151044 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:58:58.159019 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:58:58.162080 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:58:58.176317 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:58:58.186540 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:58:58.191812 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:58:58.195870 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:58:58.197550 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:58:58.201067 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Mar 25 01:58:58.229970 dracut-cmdline[236]: dracut-dracut-053 Mar 25 01:58:58.234354 dracut-cmdline[236]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9 Mar 25 01:58:58.256072 systemd-resolved[235]: Positive Trust Anchors: Mar 25 01:58:58.256104 systemd-resolved[235]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:58:58.256149 systemd-resolved[235]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:58:58.265004 systemd-resolved[235]: Defaulting to hostname 'linux'. Mar 25 01:58:58.268186 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:58:58.269717 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:58:58.333943 kernel: SCSI subsystem initialized Mar 25 01:58:58.345026 kernel: Loading iSCSI transport class v2.0-870. 
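The negative trust anchors systemd-resolved lists above disable DNSSEC validation for private and reverse-lookup zones; a name is covered when it equals or sits below one of the anchor domains. A hedged sketch of that suffix check (function name and the trimmed anchor set are illustrative):

```python
# A subset of the negative trust anchors from this boot's log.
NEGATIVE_ANCHORS = {
    "home.arpa", "10.in-addr.arpa", "168.192.in-addr.arpa",
    "d.f.ip6.arpa", "corp", "home", "internal", "lan", "local",
}

def under_negative_anchor(name: str) -> bool:
    """True if `name` equals or is a subdomain of any negative trust anchor."""
    labels = name.rstrip(".").split(".")
    # Test every suffix of the name against the anchor set.
    return any(".".join(labels[i:]) in NEGATIVE_ANCHORS for i in range(len(labels)))

print(under_negative_anchor("5.1.168.192.in-addr.arpa"))  # True
print(under_negative_anchor("example.org"))               # False
```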
Mar 25 01:58:58.357922 kernel: iscsi: registered transport (tcp) Mar 25 01:58:58.381999 kernel: iscsi: registered transport (qla4xxx) Mar 25 01:58:58.382080 kernel: QLogic iSCSI HBA Driver Mar 25 01:58:58.434926 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 25 01:58:58.437358 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 25 01:58:58.481479 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 25 01:58:58.481571 kernel: device-mapper: uevent: version 1.0.3 Mar 25 01:58:58.481593 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 25 01:58:58.530950 kernel: raid6: sse2x4 gen() 13291 MB/s Mar 25 01:58:58.547924 kernel: raid6: sse2x2 gen() 9417 MB/s Mar 25 01:58:58.566412 kernel: raid6: sse2x1 gen() 10085 MB/s Mar 25 01:58:58.566493 kernel: raid6: using algorithm sse2x4 gen() 13291 MB/s Mar 25 01:58:58.585442 kernel: raid6: .... xor() 7976 MB/s, rmw enabled Mar 25 01:58:58.585516 kernel: raid6: using ssse3x2 recovery algorithm Mar 25 01:58:58.609925 kernel: xor: automatically using best checksumming function avx Mar 25 01:58:58.772926 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 25 01:58:58.787651 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:58:58.791071 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:58:58.821300 systemd-udevd[420]: Using default interface naming scheme 'v255'. Mar 25 01:58:58.830035 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:58:58.836430 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 25 01:58:58.863973 dracut-pre-trigger[425]: rd.md=0: removing MD RAID activation Mar 25 01:58:58.904427 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Mar 25 01:58:58.907089 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:58:59.028229 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:58:59.033064 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 25 01:58:59.066713 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 25 01:58:59.068341 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:58:59.070648 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:58:59.071861 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:58:59.075418 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 25 01:58:59.106981 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:58:59.156915 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues Mar 25 01:58:59.265741 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) Mar 25 01:58:59.265978 kernel: cryptd: max_cpu_qlen set to 1000 Mar 25 01:58:59.265999 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 25 01:58:59.266016 kernel: GPT:17805311 != 125829119 Mar 25 01:58:59.266032 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 25 01:58:59.266048 kernel: GPT:17805311 != 125829119 Mar 25 01:58:59.266064 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 25 01:58:59.266081 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:58:59.266105 kernel: ACPI: bus type USB registered Mar 25 01:58:59.266122 kernel: usbcore: registered new interface driver usbfs Mar 25 01:58:59.266138 kernel: usbcore: registered new interface driver hub Mar 25 01:58:59.266155 kernel: usbcore: registered new device driver usb Mar 25 01:58:59.266171 kernel: libata version 3.00 loaded. 
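The GPT complaints above ("GPT:17805311 != 125829119") are the classic sign of an image built for a smaller disk booting on a larger volume: the backup header still sits at the original end-of-image LBA rather than on the disk's last sector (virtio reports 125829120 blocks here, so the last LBA is 125829119). The check itself is simple; a hedged sketch of it (disk-uuid.service performs the actual repair later in this log):

```python
def backup_header_misplaced(backup_lba: int, total_sectors: int) -> bool:
    """A GPT backup header belongs on the disk's last LBA (total_sectors - 1)."""
    return backup_lba != total_sectors - 1

# Values from this boot: 64.4 GB virtio disk, image-sized backup header.
print(backup_header_misplaced(17805311, 125829120))  # True -> needs relocation
```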
Mar 25 01:58:59.266188 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Mar 25 01:58:59.273073 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Mar 25 01:58:59.273310 kernel: AVX version of gcm_enc/dec engaged. Mar 25 01:58:59.273346 kernel: AES CTR mode by8 optimization enabled Mar 25 01:58:59.273364 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 25 01:58:59.273595 kernel: ahci 0000:00:1f.2: version 3.0 Mar 25 01:58:59.292044 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 25 01:58:59.292076 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Mar 25 01:58:59.292294 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Mar 25 01:58:59.292841 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 25 01:58:59.293075 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Mar 25 01:58:59.293304 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Mar 25 01:58:59.293507 kernel: hub 1-0:1.0: USB hub found Mar 25 01:58:59.296018 kernel: hub 1-0:1.0: 4 ports detected Mar 25 01:58:59.296282 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Mar 25 01:58:59.296583 kernel: hub 2-0:1.0: USB hub found Mar 25 01:58:59.296803 kernel: hub 2-0:1.0: 4 ports detected Mar 25 01:58:59.297108 kernel: scsi host0: ahci Mar 25 01:58:59.297308 kernel: scsi host1: ahci Mar 25 01:58:59.297510 kernel: scsi host2: ahci Mar 25 01:58:59.297715 kernel: scsi host3: ahci Mar 25 01:58:59.297916 kernel: scsi host4: ahci Mar 25 01:58:59.298107 kernel: scsi host5: ahci Mar 25 01:58:59.298322 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 41 Mar 25 01:58:59.298344 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 41 Mar 25 01:58:59.298361 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 41 Mar 25 01:58:59.298377 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 41 Mar 25 01:58:59.298394 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 41 Mar 25 01:58:59.298411 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 41 Mar 25 01:58:59.226284 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:58:59.385696 kernel: BTRFS: device fsid 6d9424cd-1432-492b-b006-b311869817e2 devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (475) Mar 25 01:58:59.385731 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (473) Mar 25 01:58:59.226451 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:58:59.230729 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:58:59.231390 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:58:59.231586 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:58:59.232413 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:58:59.235383 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Mar 25 01:58:59.237783 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:58:59.369826 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Mar 25 01:58:59.385334 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:58:59.406921 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 25 01:58:59.427797 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 25 01:58:59.438035 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 25 01:58:59.438961 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 25 01:58:59.443163 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 25 01:58:59.445677 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:58:59.468289 disk-uuid[557]: Primary Header is updated. Mar 25 01:58:59.468289 disk-uuid[557]: Secondary Entries is updated. Mar 25 01:58:59.468289 disk-uuid[557]: Secondary Header is updated. Mar 25 01:58:59.476064 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:58:59.480831 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Mar 25 01:58:59.516989 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 25 01:58:59.598911 kernel: ata4: SATA link down (SStatus 0 SControl 300) Mar 25 01:58:59.601057 kernel: ata1: SATA link down (SStatus 0 SControl 300) Mar 25 01:58:59.601092 kernel: ata2: SATA link down (SStatus 0 SControl 300) Mar 25 01:58:59.609245 kernel: ata5: SATA link down (SStatus 0 SControl 300) Mar 25 01:58:59.609290 kernel: ata6: SATA link down (SStatus 0 SControl 300) Mar 25 01:58:59.610973 kernel: ata3: SATA link down (SStatus 0 SControl 300) Mar 25 01:58:59.658931 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 25 01:58:59.665013 kernel: usbcore: registered new interface driver usbhid Mar 25 01:58:59.665051 kernel: usbhid: USB HID core driver Mar 25 01:58:59.673100 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Mar 25 01:58:59.673146 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Mar 25 01:59:00.487588 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:59:00.491566 disk-uuid[562]: The operation has completed successfully. Mar 25 01:59:00.562454 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 25 01:59:00.562646 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 25 01:59:00.613908 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 25 01:59:00.629167 sh[585]: Success Mar 25 01:59:00.647839 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Mar 25 01:59:00.700822 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 25 01:59:00.713359 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 25 01:59:00.715588 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 25 01:59:00.744932 kernel: BTRFS info (device dm-0): first mount of filesystem 6d9424cd-1432-492b-b006-b311869817e2 Mar 25 01:59:00.745003 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 25 01:59:00.745024 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 25 01:59:00.747310 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 25 01:59:00.749771 kernel: BTRFS info (device dm-0): using free space tree Mar 25 01:59:00.760909 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 25 01:59:00.762245 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 25 01:59:00.765056 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 25 01:59:00.766680 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 25 01:59:00.803922 kernel: BTRFS info (device vda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67 Mar 25 01:59:00.807621 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 25 01:59:00.807659 kernel: BTRFS info (device vda6): using free space tree Mar 25 01:59:00.811914 kernel: BTRFS info (device vda6): auto enabling async discard Mar 25 01:59:00.819933 kernel: BTRFS info (device vda6): last unmount of filesystem a72930ba-1354-475c-94df-b83a66efea67 Mar 25 01:59:00.824769 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 25 01:59:00.827711 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 25 01:59:00.952666 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:59:00.962307 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Mar 25 01:59:00.979292 ignition[683]: Ignition 2.20.0 Mar 25 01:59:00.979308 ignition[683]: Stage: fetch-offline Mar 25 01:59:00.979385 ignition[683]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:59:00.979404 ignition[683]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 25 01:59:00.984536 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:59:00.979576 ignition[683]: parsed url from cmdline: "" Mar 25 01:59:00.979583 ignition[683]: no config URL provided Mar 25 01:59:00.979592 ignition[683]: reading system config file "/usr/lib/ignition/user.ign" Mar 25 01:59:00.979608 ignition[683]: no config at "/usr/lib/ignition/user.ign" Mar 25 01:59:00.979617 ignition[683]: failed to fetch config: resource requires networking Mar 25 01:59:00.979857 ignition[683]: Ignition finished successfully Mar 25 01:59:01.012975 systemd-networkd[769]: lo: Link UP Mar 25 01:59:01.012989 systemd-networkd[769]: lo: Gained carrier Mar 25 01:59:01.015445 systemd-networkd[769]: Enumeration completed Mar 25 01:59:01.016027 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:59:01.016141 systemd-networkd[769]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:59:01.016147 systemd-networkd[769]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:59:01.017620 systemd-networkd[769]: eth0: Link UP Mar 25 01:59:01.017625 systemd-networkd[769]: eth0: Gained carrier Mar 25 01:59:01.017636 systemd-networkd[769]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:59:01.018753 systemd[1]: Reached target network.target - Network. Mar 25 01:59:01.022692 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Mar 25 01:59:01.041962 systemd-networkd[769]: eth0: DHCPv4 address 10.230.42.214/30, gateway 10.230.42.213 acquired from 10.230.42.213
Mar 25 01:59:01.049029 ignition[774]: Ignition 2.20.0
Mar 25 01:59:01.049050 ignition[774]: Stage: fetch
Mar 25 01:59:01.049305 ignition[774]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:59:01.049327 ignition[774]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 25 01:59:01.049458 ignition[774]: parsed url from cmdline: ""
Mar 25 01:59:01.049465 ignition[774]: no config URL provided
Mar 25 01:59:01.049475 ignition[774]: reading system config file "/usr/lib/ignition/user.ign"
Mar 25 01:59:01.049501 ignition[774]: no config at "/usr/lib/ignition/user.ign"
Mar 25 01:59:01.049706 ignition[774]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Mar 25 01:59:01.050093 ignition[774]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Mar 25 01:59:01.050140 ignition[774]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Mar 25 01:59:01.066319 ignition[774]: GET result: OK
Mar 25 01:59:01.067271 ignition[774]: parsing config with SHA512: 81f55a79e0af41ef8bbacf26dacd878bc79a98ee150f61a5a5469cfdb17537f8aa15c4bcc7afbf1db50b27054af8ab7f448688c221318f0c91bc65e1c4c107e9
Mar 25 01:59:01.073403 unknown[774]: fetched base config from "system"
Mar 25 01:59:01.073805 ignition[774]: fetch: fetch complete
Mar 25 01:59:01.073420 unknown[774]: fetched base config from "system"
Mar 25 01:59:01.073814 ignition[774]: fetch: fetch passed
Mar 25 01:59:01.073430 unknown[774]: fetched user config from "openstack"
Mar 25 01:59:01.073875 ignition[774]: Ignition finished successfully
Mar 25 01:59:01.077795 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 25 01:59:01.085694 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 25 01:59:01.113289 ignition[782]: Ignition 2.20.0
Mar 25 01:59:01.113325 ignition[782]: Stage: kargs
Mar 25 01:59:01.113551 ignition[782]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:59:01.113571 ignition[782]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 25 01:59:01.115746 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 25 01:59:01.114587 ignition[782]: kargs: kargs passed
Mar 25 01:59:01.114655 ignition[782]: Ignition finished successfully
Mar 25 01:59:01.120076 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 25 01:59:01.139309 ignition[789]: Ignition 2.20.0
Mar 25 01:59:01.139326 ignition[789]: Stage: disks
Mar 25 01:59:01.139554 ignition[789]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:59:01.139574 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 25 01:59:01.143252 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 25 01:59:01.142188 ignition[789]: disks: disks passed
Mar 25 01:59:01.145098 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 25 01:59:01.142252 ignition[789]: Ignition finished successfully
Mar 25 01:59:01.145869 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 25 01:59:01.147155 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 25 01:59:01.148565 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 25 01:59:01.149695 systemd[1]: Reached target basic.target - Basic System.
Mar 25 01:59:01.153378 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 25 01:59:01.185572 systemd-fsck[797]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Mar 25 01:59:01.189310 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 25 01:59:01.191947 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 25 01:59:01.304150 kernel: EXT4-fs (vda9): mounted filesystem 4e6dca82-2e50-453c-be25-61f944b72008 r/w with ordered data mode. Quota mode: none.
Mar 25 01:59:01.305188 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 25 01:59:01.306420 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 25 01:59:01.308797 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 25 01:59:01.312976 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 25 01:59:01.314742 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 25 01:59:01.319064 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Mar 25 01:59:01.320783 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 25 01:59:01.320832 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 25 01:59:01.331096 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 25 01:59:01.335049 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 25 01:59:01.338608 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (805)
Mar 25 01:59:01.341929 kernel: BTRFS info (device vda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:59:01.348497 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:59:01.348538 kernel: BTRFS info (device vda6): using free space tree
Mar 25 01:59:01.354685 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 25 01:59:01.358139 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 25 01:59:01.432496 initrd-setup-root[834]: cut: /sysroot/etc/passwd: No such file or directory
Mar 25 01:59:01.440382 initrd-setup-root[841]: cut: /sysroot/etc/group: No such file or directory
Mar 25 01:59:01.447809 initrd-setup-root[848]: cut: /sysroot/etc/shadow: No such file or directory
Mar 25 01:59:01.455568 initrd-setup-root[855]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 25 01:59:01.555523 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 25 01:59:01.559041 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 25 01:59:01.563059 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 25 01:59:01.580961 kernel: BTRFS info (device vda6): last unmount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:59:01.602269 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 25 01:59:01.613930 ignition[923]: INFO : Ignition 2.20.0
Mar 25 01:59:01.613930 ignition[923]: INFO : Stage: mount
Mar 25 01:59:01.615589 ignition[923]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 01:59:01.615589 ignition[923]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 25 01:59:01.615589 ignition[923]: INFO : mount: mount passed
Mar 25 01:59:01.615589 ignition[923]: INFO : Ignition finished successfully
Mar 25 01:59:01.616388 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 25 01:59:01.741769 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 25 01:59:02.231146 systemd-networkd[769]: eth0: Gained IPv6LL
Mar 25 01:59:03.740128 systemd-networkd[769]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8ab5:24:19ff:fee6:2ad6/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8ab5:24:19ff:fee6:2ad6/64 assigned by NDisc.
Mar 25 01:59:03.740156 systemd-networkd[769]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no.
Mar 25 01:59:08.480323 coreos-metadata[807]: Mar 25 01:59:08.480 WARN failed to locate config-drive, using the metadata service API instead
Mar 25 01:59:08.503979 coreos-metadata[807]: Mar 25 01:59:08.503 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Mar 25 01:59:08.517228 coreos-metadata[807]: Mar 25 01:59:08.517 INFO Fetch successful
Mar 25 01:59:08.518132 coreos-metadata[807]: Mar 25 01:59:08.518 INFO wrote hostname srv-u0apo.gb1.brightbox.com to /sysroot/etc/hostname
Mar 25 01:59:08.519641 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Mar 25 01:59:08.519802 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Mar 25 01:59:08.525995 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 25 01:59:08.544942 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 25 01:59:08.571909 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (939)
Mar 25 01:59:08.571967 kernel: BTRFS info (device vda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:59:08.571997 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:59:08.574255 kernel: BTRFS info (device vda6): using free space tree
Mar 25 01:59:08.580085 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 25 01:59:08.582126 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 25 01:59:08.614801 ignition[957]: INFO : Ignition 2.20.0
Mar 25 01:59:08.614801 ignition[957]: INFO : Stage: files
Mar 25 01:59:08.616484 ignition[957]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 01:59:08.616484 ignition[957]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 25 01:59:08.616484 ignition[957]: DEBUG : files: compiled without relabeling support, skipping
Mar 25 01:59:08.618924 ignition[957]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 25 01:59:08.618924 ignition[957]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 25 01:59:08.620795 ignition[957]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 25 01:59:08.621727 ignition[957]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 25 01:59:08.621727 ignition[957]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 25 01:59:08.621456 unknown[957]: wrote ssh authorized keys file for user: core
Mar 25 01:59:08.624508 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Mar 25 01:59:08.624508 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Mar 25 01:59:08.815743 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 25 01:59:09.121957 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Mar 25 01:59:09.121957 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 25 01:59:09.121957 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 25 01:59:09.121957 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 01:59:09.121957 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 01:59:09.121957 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 01:59:09.135765 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 01:59:09.135765 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 01:59:09.135765 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 01:59:09.135765 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 01:59:09.135765 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 01:59:09.135765 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 01:59:09.135765 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 01:59:09.135765 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 01:59:09.135765 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-x86-64.raw: attempt #1
Mar 25 01:59:09.753422 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 25 01:59:11.480449 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
Mar 25 01:59:11.480449 ignition[957]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 25 01:59:11.483679 ignition[957]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 01:59:11.483679 ignition[957]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 01:59:11.483679 ignition[957]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 25 01:59:11.483679 ignition[957]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Mar 25 01:59:11.483679 ignition[957]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Mar 25 01:59:11.483679 ignition[957]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 01:59:11.483679 ignition[957]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 01:59:11.483679 ignition[957]: INFO : files: files passed
Mar 25 01:59:11.483679 ignition[957]: INFO : Ignition finished successfully
Mar 25 01:59:11.485273 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 25 01:59:11.495077 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 25 01:59:11.496777 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 25 01:59:11.512629 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 25 01:59:11.512819 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 25 01:59:11.523398 initrd-setup-root-after-ignition[987]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:59:11.523398 initrd-setup-root-after-ignition[987]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:59:11.525528 initrd-setup-root-after-ignition[991]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:59:11.526674 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 25 01:59:11.530876 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 25 01:59:11.534267 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 25 01:59:11.593235 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 25 01:59:11.593427 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 25 01:59:11.595102 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 25 01:59:11.596378 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 25 01:59:11.597811 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 25 01:59:11.599537 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 25 01:59:11.629108 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 25 01:59:11.631949 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 25 01:59:11.658346 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 25 01:59:11.660139 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 25 01:59:11.661087 systemd[1]: Stopped target timers.target - Timer Units.
Mar 25 01:59:11.662449 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 25 01:59:11.662651 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 25 01:59:11.664263 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 25 01:59:11.665114 systemd[1]: Stopped target basic.target - Basic System.
Mar 25 01:59:11.666446 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 25 01:59:11.667697 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 25 01:59:11.668857 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 25 01:59:11.670250 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 25 01:59:11.671607 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 25 01:59:11.673083 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 25 01:59:11.674412 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 25 01:59:11.675827 systemd[1]: Stopped target swap.target - Swaps.
Mar 25 01:59:11.677073 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 25 01:59:11.677280 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 25 01:59:11.678853 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 25 01:59:11.679783 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 25 01:59:11.681103 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 25 01:59:11.681311 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 25 01:59:11.682451 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 25 01:59:11.682625 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 25 01:59:11.684532 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 25 01:59:11.684696 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 25 01:59:11.686145 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 25 01:59:11.686328 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 25 01:59:11.691165 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 25 01:59:11.691907 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 25 01:59:11.692146 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 25 01:59:11.697094 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 25 01:59:11.698416 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 25 01:59:11.698593 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 25 01:59:11.701105 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 25 01:59:11.701276 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 25 01:59:11.720296 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 25 01:59:11.720441 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 25 01:59:11.727935 ignition[1011]: INFO : Ignition 2.20.0
Mar 25 01:59:11.727935 ignition[1011]: INFO : Stage: umount
Mar 25 01:59:11.727935 ignition[1011]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 01:59:11.727935 ignition[1011]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Mar 25 01:59:11.734976 ignition[1011]: INFO : umount: umount passed
Mar 25 01:59:11.734976 ignition[1011]: INFO : Ignition finished successfully
Mar 25 01:59:11.734690 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 25 01:59:11.738476 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 25 01:59:11.739140 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 25 01:59:11.741523 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 25 01:59:11.741616 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 25 01:59:11.742485 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 25 01:59:11.742562 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 25 01:59:11.745053 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 25 01:59:11.745125 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 25 01:59:11.745985 systemd[1]: Stopped target network.target - Network.
Mar 25 01:59:11.746610 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 25 01:59:11.746690 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 25 01:59:11.747991 systemd[1]: Stopped target paths.target - Path Units.
Mar 25 01:59:11.749271 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 25 01:59:11.751155 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 25 01:59:11.752150 systemd[1]: Stopped target slices.target - Slice Units.
Mar 25 01:59:11.754374 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 25 01:59:11.756188 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 25 01:59:11.756283 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 25 01:59:11.757778 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 25 01:59:11.757844 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 25 01:59:11.759012 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 25 01:59:11.759101 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 25 01:59:11.759757 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 25 01:59:11.759821 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 25 01:59:11.760715 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 25 01:59:11.762388 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 25 01:59:11.765041 systemd-networkd[769]: eth0: DHCPv6 lease lost
Mar 25 01:59:11.773016 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 25 01:59:11.773217 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 25 01:59:11.777768 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 25 01:59:11.778119 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 25 01:59:11.778291 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 25 01:59:11.780904 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 25 01:59:11.782147 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 25 01:59:11.782253 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 25 01:59:11.785372 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 25 01:59:11.785997 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 25 01:59:11.786070 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 25 01:59:11.786801 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 25 01:59:11.786868 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 25 01:59:11.791532 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 25 01:59:11.791604 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 25 01:59:11.793836 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 25 01:59:11.793918 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 25 01:59:11.795599 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 25 01:59:11.798460 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 25 01:59:11.798565 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 25 01:59:11.813392 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 25 01:59:11.814472 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 25 01:59:11.816081 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 25 01:59:11.816227 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 25 01:59:11.817775 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 25 01:59:11.818093 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 25 01:59:11.819167 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 25 01:59:11.819284 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 25 01:59:11.820494 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 25 01:59:11.820566 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 25 01:59:11.823216 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 25 01:59:11.823302 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 25 01:59:11.826422 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 25 01:59:11.826521 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:59:11.830492 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 25 01:59:11.832054 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 25 01:59:11.832924 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 25 01:59:11.834744 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 25 01:59:11.835361 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:59:11.838314 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 25 01:59:11.838413 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 25 01:59:11.847514 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 25 01:59:11.847693 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 25 01:59:11.865926 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 25 01:59:11.866105 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 25 01:59:11.867663 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 25 01:59:11.868695 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 25 01:59:11.868767 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 25 01:59:11.871723 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 25 01:59:11.894107 systemd[1]: Switching root.
Mar 25 01:59:11.924745 systemd-journald[201]: Journal stopped
Mar 25 01:59:13.434661 systemd-journald[201]: Received SIGTERM from PID 1 (systemd).
Mar 25 01:59:13.434756 kernel: SELinux: policy capability network_peer_controls=1
Mar 25 01:59:13.434787 kernel: SELinux: policy capability open_perms=1
Mar 25 01:59:13.434820 kernel: SELinux: policy capability extended_socket_class=1
Mar 25 01:59:13.434841 kernel: SELinux: policy capability always_check_network=0
Mar 25 01:59:13.434860 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 25 01:59:13.434878 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 25 01:59:13.435967 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 25 01:59:13.435993 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 25 01:59:13.436028 kernel: audit: type=1403 audit(1742867952.186:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 25 01:59:13.436050 systemd[1]: Successfully loaded SELinux policy in 54.194ms.
Mar 25 01:59:13.436088 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 22.739ms.
Mar 25 01:59:13.436128 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 25 01:59:13.436150 systemd[1]: Detected virtualization kvm.
Mar 25 01:59:13.436170 systemd[1]: Detected architecture x86-64.
Mar 25 01:59:13.436189 systemd[1]: Detected first boot.
Mar 25 01:59:13.436223 systemd[1]: Hostname set to .
Mar 25 01:59:13.436245 systemd[1]: Initializing machine ID from VM UUID.
Mar 25 01:59:13.436265 zram_generator::config[1056]: No configuration found.
Mar 25 01:59:13.436286 kernel: Guest personality initialized and is inactive
Mar 25 01:59:13.436319 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Mar 25 01:59:13.436339 kernel: Initialized host personality
Mar 25 01:59:13.436357 kernel: NET: Registered PF_VSOCK protocol family
Mar 25 01:59:13.436376 systemd[1]: Populated /etc with preset unit settings.
Mar 25 01:59:13.436398 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 25 01:59:13.436418 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 25 01:59:13.436443 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 25 01:59:13.436470 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 25 01:59:13.436492 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 25 01:59:13.436524 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 25 01:59:13.436545 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 25 01:59:13.436564 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 25 01:59:13.436584 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 25 01:59:13.436603 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 25 01:59:13.436631 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 25 01:59:13.436650 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 25 01:59:13.436670 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 25 01:59:13.436705 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 25 01:59:13.436726 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 25 01:59:13.436754 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 25 01:59:13.436775 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 25 01:59:13.436796 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 25 01:59:13.436821 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 25 01:59:13.436851 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 25 01:59:13.436873 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 25 01:59:13.437926 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 25 01:59:13.437954 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 25 01:59:13.437997 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 25 01:59:13.438029 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 25 01:59:13.438073 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 25 01:59:13.438094 systemd[1]: Reached target slices.target - Slice Units.
Mar 25 01:59:13.438113 systemd[1]: Reached target swap.target - Swaps.
Mar 25 01:59:13.438132 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 25 01:59:13.438158 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 25 01:59:13.438196 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 25 01:59:13.438231 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 25 01:59:13.438252 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 25 01:59:13.438272 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 25 01:59:13.438291 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 25 01:59:13.438324 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 25 01:59:13.438345 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 25 01:59:13.438365 systemd[1]: Mounting media.mount - External Media Directory...
Mar 25 01:59:13.438386 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:59:13.438406 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 25 01:59:13.438426 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 25 01:59:13.438445 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 25 01:59:13.438472 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 25 01:59:13.438505 systemd[1]: Reached target machines.target - Containers.
Mar 25 01:59:13.438526 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 25 01:59:13.438546 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:59:13.438565 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 25 01:59:13.438584 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 25 01:59:13.438604 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 25 01:59:13.438623 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 25 01:59:13.438642 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 25 01:59:13.438687 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 25 01:59:13.438710 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 25 01:59:13.438731 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 25 01:59:13.438752 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 25 01:59:13.438772 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 25 01:59:13.438791 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 25 01:59:13.438811 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 25 01:59:13.438832 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:59:13.438863 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 25 01:59:13.438884 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 25 01:59:13.439952 kernel: fuse: init (API version 7.39)
Mar 25 01:59:13.439978 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 25 01:59:13.440000 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 25 01:59:13.440020 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 25 01:59:13.440039 kernel: ACPI: bus type drm_connector registered
Mar 25 01:59:13.440058 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 25 01:59:13.440094 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 25 01:59:13.440124 systemd[1]: Stopped verity-setup.service.
Mar 25 01:59:13.440144 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:59:13.440174 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 25 01:59:13.440194 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 25 01:59:13.440250 systemd[1]: Mounted media.mount - External Media Directory.
Mar 25 01:59:13.440272 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 25 01:59:13.440292 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 25 01:59:13.440312 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 25 01:59:13.440376 systemd-journald[1150]: Collecting audit messages is disabled.
Mar 25 01:59:13.440437 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 25 01:59:13.440483 systemd-journald[1150]: Journal started
Mar 25 01:59:13.440531 systemd-journald[1150]: Runtime Journal (/run/log/journal/ca6dc9962b79423398663e97d8d7c551) is 4.7M, max 37.9M, 33.2M free.
Mar 25 01:59:13.043481 systemd[1]: Queued start job for default target multi-user.target.
Mar 25 01:59:13.057225 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Mar 25 01:59:13.057965 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 25 01:59:13.449265 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 25 01:59:13.449318 kernel: loop: module loaded
Mar 25 01:59:13.451880 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 25 01:59:13.453120 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 25 01:59:13.453418 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 25 01:59:13.454535 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 25 01:59:13.454795 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 25 01:59:13.456072 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 25 01:59:13.456345 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 25 01:59:13.457404 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 25 01:59:13.457668 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 25 01:59:13.458875 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 25 01:59:13.459159 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 25 01:59:13.460274 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 25 01:59:13.460530 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 25 01:59:13.461670 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 25 01:59:13.462790 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 25 01:59:13.464013 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 25 01:59:13.465289 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 25 01:59:13.481352 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 25 01:59:13.486002 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 25 01:59:13.492077 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 25 01:59:13.492859 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 25 01:59:13.492918 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 25 01:59:13.494879 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 25 01:59:13.503102 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 25 01:59:13.507164 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 25 01:59:13.508014 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:59:13.511154 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 25 01:59:13.516389 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 25 01:59:13.517247 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 25 01:59:13.522147 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 25 01:59:13.523016 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 25 01:59:13.528999 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 25 01:59:13.535428 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 25 01:59:13.543559 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 25 01:59:13.549220 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 25 01:59:13.551184 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 25 01:59:13.553389 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 25 01:59:13.587564 systemd-journald[1150]: Time spent on flushing to /var/log/journal/ca6dc9962b79423398663e97d8d7c551 is 103.412ms for 1153 entries.
Mar 25 01:59:13.587564 systemd-journald[1150]: System Journal (/var/log/journal/ca6dc9962b79423398663e97d8d7c551) is 8M, max 584.8M, 576.8M free.
Mar 25 01:59:13.731154 systemd-journald[1150]: Received client request to flush runtime journal.
Mar 25 01:59:13.731710 kernel: loop0: detected capacity change from 0 to 8
Mar 25 01:59:13.731750 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 25 01:59:13.731777 kernel: loop1: detected capacity change from 0 to 151640
Mar 25 01:59:13.594961 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 25 01:59:13.596803 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 25 01:59:13.609652 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 25 01:59:13.675978 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 25 01:59:13.718966 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 25 01:59:13.731905 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 25 01:59:13.735082 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 25 01:59:13.749240 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 25 01:59:13.793963 kernel: loop2: detected capacity change from 0 to 218376
Mar 25 01:59:13.844046 kernel: loop3: detected capacity change from 0 to 109808
Mar 25 01:59:13.880231 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 25 01:59:13.883803 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 25 01:59:13.894913 kernel: loop4: detected capacity change from 0 to 8
Mar 25 01:59:13.895306 systemd-tmpfiles[1212]: ACLs are not supported, ignoring.
Mar 25 01:59:13.896308 systemd-tmpfiles[1212]: ACLs are not supported, ignoring.
Mar 25 01:59:13.903208 kernel: loop5: detected capacity change from 0 to 151640
Mar 25 01:59:13.922810 udevadm[1218]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 25 01:59:13.927912 kernel: loop6: detected capacity change from 0 to 218376
Mar 25 01:59:13.930407 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 25 01:59:13.964217 kernel: loop7: detected capacity change from 0 to 109808
Mar 25 01:59:13.991711 (sd-merge)[1219]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
Mar 25 01:59:13.992688 (sd-merge)[1219]: Merged extensions into '/usr'.
Mar 25 01:59:14.005203 systemd[1]: Reload requested from client PID 1195 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 25 01:59:14.005354 systemd[1]: Reloading...
Mar 25 01:59:14.108990 zram_generator::config[1245]: No configuration found.
Mar 25 01:59:14.323580 ldconfig[1190]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 25 01:59:14.407482 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 01:59:14.498830 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 25 01:59:14.499301 systemd[1]: Reloading finished in 493 ms.
Mar 25 01:59:14.515079 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 25 01:59:14.516413 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 25 01:59:14.535107 systemd[1]: Starting ensure-sysext.service...
Mar 25 01:59:14.539241 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 25 01:59:14.574588 systemd[1]: Reload requested from client PID 1304 ('systemctl') (unit ensure-sysext.service)...
Mar 25 01:59:14.574627 systemd[1]: Reloading...
Mar 25 01:59:14.576617 systemd-tmpfiles[1305]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 25 01:59:14.577472 systemd-tmpfiles[1305]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 25 01:59:14.578835 systemd-tmpfiles[1305]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 25 01:59:14.579439 systemd-tmpfiles[1305]: ACLs are not supported, ignoring.
Mar 25 01:59:14.579665 systemd-tmpfiles[1305]: ACLs are not supported, ignoring.
Mar 25 01:59:14.585626 systemd-tmpfiles[1305]: Detected autofs mount point /boot during canonicalization of boot.
Mar 25 01:59:14.585742 systemd-tmpfiles[1305]: Skipping /boot
Mar 25 01:59:14.604965 systemd-tmpfiles[1305]: Detected autofs mount point /boot during canonicalization of boot.
Mar 25 01:59:14.605148 systemd-tmpfiles[1305]: Skipping /boot
Mar 25 01:59:14.677964 zram_generator::config[1334]: No configuration found.
Mar 25 01:59:14.859800 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 01:59:14.954048 systemd[1]: Reloading finished in 378 ms.
Mar 25 01:59:14.970072 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 25 01:59:14.988953 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 25 01:59:14.999586 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 25 01:59:15.004194 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 25 01:59:15.009458 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 25 01:59:15.014976 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 25 01:59:15.024585 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 25 01:59:15.027861 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 25 01:59:15.034850 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:59:15.035201 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:59:15.038420 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 25 01:59:15.043634 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 25 01:59:15.053319 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 25 01:59:15.054149 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:59:15.054319 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:59:15.063280 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 25 01:59:15.063984 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:59:15.069958 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:59:15.070249 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:59:15.070482 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:59:15.070607 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:59:15.070731 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:59:15.079337 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:59:15.079655 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:59:15.088422 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 25 01:59:15.089740 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:59:15.089942 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:59:15.090150 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:59:15.095079 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 25 01:59:15.095567 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 25 01:59:15.109305 systemd[1]: Finished ensure-sysext.service.
Mar 25 01:59:15.110595 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 25 01:59:15.110850 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 25 01:59:15.115092 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 25 01:59:15.121319 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 25 01:59:15.126969 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 25 01:59:15.128986 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 25 01:59:15.134847 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 25 01:59:15.140923 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 25 01:59:15.141270 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 25 01:59:15.143405 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 25 01:59:15.155590 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 25 01:59:15.156224 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 25 01:59:15.173979 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 25 01:59:15.182910 augenrules[1432]: No rules
Mar 25 01:59:15.184558 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 25 01:59:15.184944 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 25 01:59:15.192539 systemd-udevd[1397]: Using default interface naming scheme 'v255'.
Mar 25 01:59:15.193476 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 25 01:59:15.197323 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 25 01:59:15.214603 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 25 01:59:15.241150 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 25 01:59:15.250646 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 25 01:59:15.433230 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 25 01:59:15.434181 systemd[1]: Reached target time-set.target - System Time Set.
Mar 25 01:59:15.452848 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 25 01:59:15.455832 systemd-resolved[1395]: Positive Trust Anchors:
Mar 25 01:59:15.455854 systemd-resolved[1395]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 25 01:59:15.456498 systemd-resolved[1395]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 25 01:59:15.468670 systemd-resolved[1395]: Using system hostname 'srv-u0apo.gb1.brightbox.com'.
Mar 25 01:59:15.473056 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 25 01:59:15.474973 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 25 01:59:15.484011 systemd-networkd[1448]: lo: Link UP
Mar 25 01:59:15.484420 systemd-networkd[1448]: lo: Gained carrier
Mar 25 01:59:15.486246 systemd-networkd[1448]: Enumeration completed
Mar 25 01:59:15.487471 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 25 01:59:15.488362 systemd[1]: Reached target network.target - Network.
Mar 25 01:59:15.493230 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 25 01:59:15.497177 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 25 01:59:15.505941 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1455)
Mar 25 01:59:15.548300 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 25 01:59:15.592981 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Mar 25 01:59:15.591829 systemd-networkd[1448]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:59:15.591853 systemd-networkd[1448]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:59:15.593760 systemd-networkd[1448]: eth0: Link UP
Mar 25 01:59:15.593772 systemd-networkd[1448]: eth0: Gained carrier
Mar 25 01:59:15.593790 systemd-networkd[1448]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:59:15.607991 systemd-networkd[1448]: eth0: DHCPv4 address 10.230.42.214/30, gateway 10.230.42.213 acquired from 10.230.42.213
Mar 25 01:59:15.609108 systemd-timesyncd[1416]: Network configuration changed, trying to establish connection.
Mar 25 01:59:15.628917 kernel: ACPI: button: Power Button [PWRF]
Mar 25 01:59:15.639354 kernel: mousedev: PS/2 mouse device common for all mice
Mar 25 01:59:15.641401 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 25 01:59:15.644025 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 25 01:59:15.670568 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 25 01:59:15.684356 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 25 01:59:15.691198 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Mar 25 01:59:15.691473 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 25 01:59:15.714304 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4
Mar 25 01:59:15.775056 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:59:15.918335 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 25 01:59:15.964309 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:59:15.968293 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 25 01:59:15.994315 lvm[1486]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 25 01:59:16.030793 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 25 01:59:16.031954 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 25 01:59:16.032682 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 25 01:59:16.033501 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 25 01:59:16.034318 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 25 01:59:16.035298 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 25 01:59:16.036340 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 25 01:59:16.037096 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 25 01:59:16.037829 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 25 01:59:16.037908 systemd[1]: Reached target paths.target - Path Units.
Mar 25 01:59:16.038488 systemd[1]: Reached target timers.target - Timer Units.
Mar 25 01:59:16.040490 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 25 01:59:16.043352 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 25 01:59:16.048123 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 25 01:59:16.049105 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 25 01:59:16.049816 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 25 01:59:16.063922 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 25 01:59:16.065676 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 25 01:59:16.068233 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 25 01:59:16.069625 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 25 01:59:16.076060 systemd[1]: Reached target sockets.target - Socket Units.
Mar 25 01:59:16.076724 systemd[1]: Reached target basic.target - Basic System.
Mar 25 01:59:16.077418 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 25 01:59:16.077474 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 25 01:59:16.084303 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 25 01:59:16.089180 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 25 01:59:16.089424 lvm[1490]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 25 01:59:16.094276 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 25 01:59:16.098115 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 25 01:59:16.103188 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 25 01:59:16.105207 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 25 01:59:16.111565 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 25 01:59:16.114214 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 25 01:59:16.123250 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 25 01:59:16.128188 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 25 01:59:16.138737 jq[1494]: false
Mar 25 01:59:16.142574 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 25 01:59:16.145146 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 25 01:59:16.155702 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 25 01:59:16.158279 systemd[1]: Starting update-engine.service - Update Engine...
Mar 25 01:59:16.171685 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 25 01:59:16.175956 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 25 01:59:16.180519 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 25 01:59:16.180829 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 25 01:59:16.191806 dbus-daemon[1493]: [system] SELinux support is enabled
Mar 25 01:59:16.196347 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 25 01:59:16.196706 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 25 01:59:16.198272 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 25 01:59:16.212720 dbus-daemon[1493]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1448 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0")
Mar 25 01:59:16.214104 jq[1505]: true
Mar 25 01:59:16.238412 extend-filesystems[1495]: Found loop4
Mar 25 01:59:16.238412 extend-filesystems[1495]: Found loop5
Mar 25 01:59:16.246016 extend-filesystems[1495]: Found loop6
Mar 25 01:59:16.246016 extend-filesystems[1495]: Found loop7
Mar 25 01:59:16.246016 extend-filesystems[1495]: Found vda
Mar 25 01:59:16.246016 extend-filesystems[1495]: Found vda1
Mar 25 01:59:16.246016 extend-filesystems[1495]: Found vda2
Mar 25 01:59:16.246016 extend-filesystems[1495]: Found vda3
Mar 25 01:59:16.246016 extend-filesystems[1495]: Found usr
Mar 25 01:59:16.246016 extend-filesystems[1495]: Found vda4
Mar 25 01:59:16.246016 extend-filesystems[1495]: Found vda6
Mar 25 01:59:16.246016 extend-filesystems[1495]: Found vda7
Mar 25 01:59:16.246016 extend-filesystems[1495]: Found vda9
Mar 25 01:59:16.246016 extend-filesystems[1495]: Checking size of /dev/vda9
Mar 25 01:59:16.240946 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 25 01:59:16.248736 dbus-daemon[1493]: [system] Successfully activated service 'org.freedesktop.systemd1'
Mar 25 01:59:16.279246 update_engine[1504]: I20250325 01:59:16.255122 1504 main.cc:92] Flatcar Update Engine starting
Mar 25 01:59:16.241008 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 25 01:59:16.245456 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 25 01:59:16.284114 update_engine[1504]: I20250325 01:59:16.281425 1504 update_check_scheduler.cc:74] Next update check in 2m33s Mar 25 01:59:16.245487 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 25 01:59:16.260516 systemd[1]: motdgen.service: Deactivated successfully. Mar 25 01:59:16.260845 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 25 01:59:16.263402 (ntainerd)[1523]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 25 01:59:16.269419 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Mar 25 01:59:16.271980 systemd[1]: Started update-engine.service - Update Engine. Mar 25 01:59:16.275726 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 25 01:59:16.297972 tar[1509]: linux-amd64/LICENSE Mar 25 01:59:16.297972 tar[1509]: linux-amd64/helm Mar 25 01:59:16.298378 jq[1519]: true Mar 25 01:59:16.320945 extend-filesystems[1495]: Resized partition /dev/vda9 Mar 25 01:59:16.334302 extend-filesystems[1536]: resize2fs 1.47.2 (1-Jan-2025) Mar 25 01:59:16.339114 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Mar 25 01:59:16.345349 systemd-logind[1502]: Watching system buttons on /dev/input/event2 (Power Button) Mar 25 01:59:16.345392 systemd-logind[1502]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 25 01:59:16.346073 systemd-logind[1502]: New seat seat0. Mar 25 01:59:16.358012 systemd[1]: Started systemd-logind.service - User Login Management. 
Mar 25 01:59:16.516225 bash[1550]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:59:16.507752 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 25 01:59:16.521203 systemd[1]: Starting sshkeys.service... Mar 25 01:59:16.549908 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1447) Mar 25 01:59:16.565375 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 25 01:59:16.573735 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 25 01:59:16.681383 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 25 01:59:16.698604 dbus-daemon[1493]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 25 01:59:16.715455 dbus-daemon[1493]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1530 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 25 01:59:16.728937 systemd[1]: Starting polkit.service - Authorization Manager... Mar 25 01:59:16.748927 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Mar 25 01:59:16.785737 extend-filesystems[1536]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 25 01:59:16.785737 extend-filesystems[1536]: old_desc_blocks = 1, new_desc_blocks = 8 Mar 25 01:59:16.785737 extend-filesystems[1536]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Mar 25 01:59:16.796281 extend-filesystems[1495]: Resized filesystem in /dev/vda9 Mar 25 01:59:16.789527 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 25 01:59:16.791984 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Mar 25 01:59:16.811600 polkitd[1557]: Started polkitd version 121 Mar 25 01:59:16.826607 polkitd[1557]: Loading rules from directory /etc/polkit-1/rules.d Mar 25 01:59:16.826719 polkitd[1557]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 25 01:59:16.829190 polkitd[1557]: Finished loading, compiling and executing 2 rules Mar 25 01:59:16.830504 dbus-daemon[1493]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 25 01:59:16.831580 systemd[1]: Started polkit.service - Authorization Manager. Mar 25 01:59:16.831946 polkitd[1557]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 25 01:59:16.853879 systemd-hostnamed[1530]: Hostname set to (static) Mar 25 01:59:16.872952 locksmithd[1531]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 25 01:59:17.143633 systemd-networkd[1448]: eth0: Gained IPv6LL Mar 25 01:59:17.151027 systemd-timesyncd[1416]: Network configuration changed, trying to establish connection. Mar 25 01:59:17.156832 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 25 01:59:17.158935 systemd[1]: Reached target network-online.target - Network is Online. Mar 25 01:59:17.166425 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:59:17.175655 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... 
Mar 25 01:59:17.294924 sshd_keygen[1518]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 25 01:59:17.400034 containerd[1523]: time="2025-03-25T01:59:17Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 25 01:59:17.405778 containerd[1523]: time="2025-03-25T01:59:17.405727566Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 Mar 25 01:59:17.442251 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 25 01:59:17.445321 containerd[1523]: time="2025-03-25T01:59:17.445260741Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="23.916µs" Mar 25 01:59:17.445468 containerd[1523]: time="2025-03-25T01:59:17.445421054Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 25 01:59:17.445607 containerd[1523]: time="2025-03-25T01:59:17.445574023Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 25 01:59:17.446035 containerd[1523]: time="2025-03-25T01:59:17.445996513Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 25 01:59:17.446222 containerd[1523]: time="2025-03-25T01:59:17.446192773Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 25 01:59:17.446375 containerd[1523]: time="2025-03-25T01:59:17.446348719Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:59:17.446603 containerd[1523]: time="2025-03-25T01:59:17.446573255Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 
01:59:17.446700 containerd[1523]: time="2025-03-25T01:59:17.446670579Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:59:17.447180 containerd[1523]: time="2025-03-25T01:59:17.447143408Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:59:17.447277 containerd[1523]: time="2025-03-25T01:59:17.447251920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:59:17.447428 containerd[1523]: time="2025-03-25T01:59:17.447398619Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:59:17.447960 containerd[1523]: time="2025-03-25T01:59:17.447925925Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 25 01:59:17.448226 containerd[1523]: time="2025-03-25T01:59:17.448193486Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 25 01:59:17.448759 containerd[1523]: time="2025-03-25T01:59:17.448725365Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:59:17.450383 containerd[1523]: time="2025-03-25T01:59:17.450346003Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:59:17.450481 containerd[1523]: time="2025-03-25T01:59:17.450455980Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 25 01:59:17.450557 systemd[1]: 
Starting issuegen.service - Generate /run/issue... Mar 25 01:59:17.450790 containerd[1523]: time="2025-03-25T01:59:17.450760746Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 25 01:59:17.452406 containerd[1523]: time="2025-03-25T01:59:17.452374845Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 25 01:59:17.454068 containerd[1523]: time="2025-03-25T01:59:17.454037806Z" level=info msg="metadata content store policy set" policy=shared Mar 25 01:59:17.457544 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 25 01:59:17.463096 containerd[1523]: time="2025-03-25T01:59:17.463052970Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 25 01:59:17.463331 containerd[1523]: time="2025-03-25T01:59:17.463301461Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 25 01:59:17.463465 containerd[1523]: time="2025-03-25T01:59:17.463438887Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 25 01:59:17.463560 containerd[1523]: time="2025-03-25T01:59:17.463536034Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 25 01:59:17.463658 containerd[1523]: time="2025-03-25T01:59:17.463634435Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 25 01:59:17.463792 containerd[1523]: time="2025-03-25T01:59:17.463755062Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 25 01:59:17.463934 containerd[1523]: time="2025-03-25T01:59:17.463908605Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 25 01:59:17.464046 containerd[1523]: time="2025-03-25T01:59:17.464020004Z" level=info 
msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 25 01:59:17.464205 containerd[1523]: time="2025-03-25T01:59:17.464178461Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 25 01:59:17.464317 containerd[1523]: time="2025-03-25T01:59:17.464292491Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 25 01:59:17.464427 containerd[1523]: time="2025-03-25T01:59:17.464402173Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 25 01:59:17.464522 containerd[1523]: time="2025-03-25T01:59:17.464498215Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 25 01:59:17.464839 containerd[1523]: time="2025-03-25T01:59:17.464810741Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 25 01:59:17.464984 containerd[1523]: time="2025-03-25T01:59:17.464955453Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 25 01:59:17.465152 containerd[1523]: time="2025-03-25T01:59:17.465123276Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 25 01:59:17.465250 containerd[1523]: time="2025-03-25T01:59:17.465225661Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 25 01:59:17.465342 containerd[1523]: time="2025-03-25T01:59:17.465318394Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 25 01:59:17.465446 containerd[1523]: time="2025-03-25T01:59:17.465421493Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 25 01:59:17.465582 containerd[1523]: time="2025-03-25T01:59:17.465555051Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 25 01:59:17.465719 containerd[1523]: time="2025-03-25T01:59:17.465693169Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 25 01:59:17.465837 containerd[1523]: time="2025-03-25T01:59:17.465811729Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 25 01:59:17.466145 containerd[1523]: time="2025-03-25T01:59:17.466116927Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 25 01:59:17.466312 containerd[1523]: time="2025-03-25T01:59:17.466286092Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 25 01:59:17.466553 containerd[1523]: time="2025-03-25T01:59:17.466521689Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 25 01:59:17.467918 containerd[1523]: time="2025-03-25T01:59:17.466652915Z" level=info msg="Start snapshots syncer" Mar 25 01:59:17.467918 containerd[1523]: time="2025-03-25T01:59:17.466739410Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 25 01:59:17.467918 containerd[1523]: time="2025-03-25T01:59:17.467282894Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 25 01:59:17.468268 containerd[1523]: time="2025-03-25T01:59:17.467374772Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 25 01:59:17.468268 containerd[1523]: time="2025-03-25T01:59:17.467495762Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 25 01:59:17.468268 containerd[1523]: time="2025-03-25T01:59:17.467657604Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 25 01:59:17.468268 containerd[1523]: time="2025-03-25T01:59:17.467692806Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 25 01:59:17.468268 containerd[1523]: time="2025-03-25T01:59:17.467713130Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 25 01:59:17.468268 containerd[1523]: time="2025-03-25T01:59:17.467731363Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 25 01:59:17.468268 containerd[1523]: time="2025-03-25T01:59:17.467759168Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 25 01:59:17.468268 containerd[1523]: time="2025-03-25T01:59:17.467780831Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 25 01:59:17.468268 containerd[1523]: time="2025-03-25T01:59:17.467818658Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 25 01:59:17.468268 containerd[1523]: time="2025-03-25T01:59:17.467862921Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 25 01:59:17.468853 containerd[1523]: time="2025-03-25T01:59:17.468825562Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 25 01:59:17.469884 containerd[1523]: time="2025-03-25T01:59:17.469649255Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 25 01:59:17.469884 containerd[1523]: time="2025-03-25T01:59:17.469725615Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:59:17.469884 containerd[1523]: time="2025-03-25T01:59:17.469756090Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:59:17.469884 containerd[1523]: time="2025-03-25T01:59:17.469772017Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:59:17.469884 containerd[1523]: time="2025-03-25T01:59:17.469797127Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:59:17.469884 containerd[1523]: time="2025-03-25T01:59:17.469814318Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 25 01:59:17.472934 containerd[1523]: time="2025-03-25T01:59:17.469831402Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 25 01:59:17.472934 containerd[1523]: time="2025-03-25T01:59:17.470615080Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 25 01:59:17.472934 containerd[1523]: time="2025-03-25T01:59:17.470674642Z" level=info msg="runtime interface created" Mar 25 01:59:17.472934 containerd[1523]: time="2025-03-25T01:59:17.470690678Z" level=info msg="created NRI interface" Mar 25 01:59:17.472934 containerd[1523]: time="2025-03-25T01:59:17.470716884Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 25 01:59:17.472934 containerd[1523]: time="2025-03-25T01:59:17.470743245Z" level=info msg="Connect containerd service" Mar 25 01:59:17.472934 containerd[1523]: time="2025-03-25T01:59:17.470814036Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 25 01:59:17.474729 
containerd[1523]: time="2025-03-25T01:59:17.474582293Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:59:17.495766 systemd[1]: issuegen.service: Deactivated successfully. Mar 25 01:59:17.496218 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 25 01:59:17.507145 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 25 01:59:17.709076 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 25 01:59:17.719228 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 25 01:59:17.722258 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 25 01:59:17.724337 systemd[1]: Reached target getty.target - Login Prompts. Mar 25 01:59:17.920808 containerd[1523]: time="2025-03-25T01:59:17.920546698Z" level=info msg="Start subscribing containerd event" Mar 25 01:59:17.920808 containerd[1523]: time="2025-03-25T01:59:17.920630647Z" level=info msg="Start recovering state" Mar 25 01:59:17.921040 containerd[1523]: time="2025-03-25T01:59:17.920880875Z" level=info msg="Start event monitor" Mar 25 01:59:17.921040 containerd[1523]: time="2025-03-25T01:59:17.920934573Z" level=info msg="Start cni network conf syncer for default" Mar 25 01:59:17.921040 containerd[1523]: time="2025-03-25T01:59:17.920967461Z" level=info msg="Start streaming server" Mar 25 01:59:17.921040 containerd[1523]: time="2025-03-25T01:59:17.921001784Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 25 01:59:17.921040 containerd[1523]: time="2025-03-25T01:59:17.921019906Z" level=info msg="runtime interface starting up..." Mar 25 01:59:17.921040 containerd[1523]: time="2025-03-25T01:59:17.921039259Z" level=info msg="starting plugins..." 
Mar 25 01:59:17.921283 containerd[1523]: time="2025-03-25T01:59:17.921072411Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 25 01:59:17.921927 containerd[1523]: time="2025-03-25T01:59:17.921861249Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 25 01:59:17.925923 containerd[1523]: time="2025-03-25T01:59:17.925415976Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 25 01:59:17.935302 containerd[1523]: time="2025-03-25T01:59:17.935256914Z" level=info msg="containerd successfully booted in 0.542920s" Mar 25 01:59:17.935318 systemd[1]: Started containerd.service - containerd container runtime. Mar 25 01:59:18.136920 tar[1509]: linux-amd64/README.md Mar 25 01:59:18.162724 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 25 01:59:18.650617 systemd-timesyncd[1416]: Network configuration changed, trying to establish connection. Mar 25 01:59:18.652625 systemd-networkd[1448]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8ab5:24:19ff:fee6:2ad6/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8ab5:24:19ff:fee6:2ad6/64 assigned by NDisc. Mar 25 01:59:18.652635 systemd-networkd[1448]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Mar 25 01:59:19.086505 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 25 01:59:19.091260 systemd[1]: Started sshd@0-10.230.42.214:22-139.178.68.195:38902.service - OpenSSH per-connection server daemon (139.178.68.195:38902). Mar 25 01:59:19.259311 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:59:19.274538 (kubelet)[1633]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:59:19.834119 systemd-timesyncd[1416]: Network configuration changed, trying to establish connection. 
Mar 25 01:59:20.064289 kubelet[1633]: E0325 01:59:20.064148 1633 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:59:20.066805 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:59:20.067095 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:59:20.067860 systemd[1]: kubelet.service: Consumed 1.813s CPU time, 251M memory peak. Mar 25 01:59:20.081666 sshd[1626]: Accepted publickey for core from 139.178.68.195 port 38902 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:59:20.084600 sshd-session[1626]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:59:20.095819 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 25 01:59:20.098846 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 25 01:59:20.109881 systemd-logind[1502]: New session 1 of user core. Mar 25 01:59:20.131538 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 25 01:59:20.136935 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 25 01:59:20.152405 (systemd)[1644]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 25 01:59:20.156201 systemd-logind[1502]: New session c1 of user core. Mar 25 01:59:20.332660 systemd[1644]: Queued start job for default target default.target. Mar 25 01:59:20.340700 systemd[1644]: Created slice app.slice - User Application Slice. Mar 25 01:59:20.340745 systemd[1644]: Reached target paths.target - Paths. Mar 25 01:59:20.340815 systemd[1644]: Reached target timers.target - Timers. 
Mar 25 01:59:20.342868 systemd[1644]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 25 01:59:20.364362 systemd[1644]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 25 01:59:20.364733 systemd[1644]: Reached target sockets.target - Sockets. Mar 25 01:59:20.364961 systemd[1644]: Reached target basic.target - Basic System. Mar 25 01:59:20.365260 systemd[1644]: Reached target default.target - Main User Target. Mar 25 01:59:20.365296 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 25 01:59:20.365517 systemd[1644]: Startup finished in 200ms. Mar 25 01:59:20.377194 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 25 01:59:21.012856 systemd[1]: Started sshd@1-10.230.42.214:22-139.178.68.195:38918.service - OpenSSH per-connection server daemon (139.178.68.195:38918). Mar 25 01:59:21.976633 sshd[1655]: Accepted publickey for core from 139.178.68.195 port 38918 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:59:21.978823 sshd-session[1655]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:59:21.986089 systemd-logind[1502]: New session 2 of user core. Mar 25 01:59:21.992201 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 25 01:59:22.599228 sshd[1657]: Connection closed by 139.178.68.195 port 38918 Mar 25 01:59:22.600135 sshd-session[1655]: pam_unix(sshd:session): session closed for user core Mar 25 01:59:22.604204 systemd[1]: sshd@1-10.230.42.214:22-139.178.68.195:38918.service: Deactivated successfully. Mar 25 01:59:22.606638 systemd[1]: session-2.scope: Deactivated successfully. Mar 25 01:59:22.608833 systemd-logind[1502]: Session 2 logged out. Waiting for processes to exit. Mar 25 01:59:22.610310 systemd-logind[1502]: Removed session 2. Mar 25 01:59:22.756338 systemd[1]: Started sshd@2-10.230.42.214:22-139.178.68.195:38934.service - OpenSSH per-connection server daemon (139.178.68.195:38934). 
Mar 25 01:59:22.838039 login[1615]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 25 01:59:22.846963 systemd-logind[1502]: New session 3 of user core. Mar 25 01:59:22.849541 login[1614]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 25 01:59:22.851438 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 25 01:59:22.864105 systemd-logind[1502]: New session 4 of user core. Mar 25 01:59:22.866758 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 25 01:59:23.449773 coreos-metadata[1492]: Mar 25 01:59:23.449 WARN failed to locate config-drive, using the metadata service API instead Mar 25 01:59:23.477158 coreos-metadata[1492]: Mar 25 01:59:23.477 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Mar 25 01:59:23.485602 coreos-metadata[1492]: Mar 25 01:59:23.485 INFO Fetch failed with 404: resource not found Mar 25 01:59:23.485602 coreos-metadata[1492]: Mar 25 01:59:23.485 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 25 01:59:23.486195 coreos-metadata[1492]: Mar 25 01:59:23.486 INFO Fetch successful Mar 25 01:59:23.486195 coreos-metadata[1492]: Mar 25 01:59:23.486 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Mar 25 01:59:23.496938 coreos-metadata[1492]: Mar 25 01:59:23.496 INFO Fetch successful Mar 25 01:59:23.497079 coreos-metadata[1492]: Mar 25 01:59:23.496 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Mar 25 01:59:23.512240 coreos-metadata[1492]: Mar 25 01:59:23.512 INFO Fetch successful Mar 25 01:59:23.512404 coreos-metadata[1492]: Mar 25 01:59:23.512 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Mar 25 01:59:23.529034 coreos-metadata[1492]: Mar 25 01:59:23.529 INFO Fetch successful Mar 25 01:59:23.529104 coreos-metadata[1492]: Mar 25 01:59:23.529 INFO Fetching 
http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Mar 25 01:59:23.550004 coreos-metadata[1492]: Mar 25 01:59:23.548 INFO Fetch successful Mar 25 01:59:23.584745 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 25 01:59:23.586424 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 25 01:59:23.714507 sshd[1663]: Accepted publickey for core from 139.178.68.195 port 38934 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8 Mar 25 01:59:23.716521 sshd-session[1663]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:59:23.723190 systemd-logind[1502]: New session 5 of user core. Mar 25 01:59:23.734332 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 25 01:59:23.974866 coreos-metadata[1553]: Mar 25 01:59:23.974 WARN failed to locate config-drive, using the metadata service API instead Mar 25 01:59:23.996285 coreos-metadata[1553]: Mar 25 01:59:23.996 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 25 01:59:24.018156 coreos-metadata[1553]: Mar 25 01:59:24.018 INFO Fetch successful Mar 25 01:59:24.018376 coreos-metadata[1553]: Mar 25 01:59:24.018 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 25 01:59:24.045234 coreos-metadata[1553]: Mar 25 01:59:24.045 INFO Fetch successful Mar 25 01:59:24.047613 unknown[1553]: wrote ssh authorized keys file for user: core Mar 25 01:59:24.072589 update-ssh-keys[1699]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:59:24.073614 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Mar 25 01:59:24.076273 systemd[1]: Finished sshkeys.service. Mar 25 01:59:24.079407 systemd[1]: Reached target multi-user.target - Multi-User System. 
Mar 25 01:59:24.079716 systemd[1]: Startup finished in 1.249s (kernel) + 14.433s (initrd) + 11.946s (userspace) = 27.628s.
Mar 25 01:59:24.333478 sshd[1696]: Connection closed by 139.178.68.195 port 38934
Mar 25 01:59:24.333170 sshd-session[1663]: pam_unix(sshd:session): session closed for user core
Mar 25 01:59:24.337927 systemd[1]: sshd@2-10.230.42.214:22-139.178.68.195:38934.service: Deactivated successfully.
Mar 25 01:59:24.340547 systemd[1]: session-5.scope: Deactivated successfully.
Mar 25 01:59:24.342492 systemd-logind[1502]: Session 5 logged out. Waiting for processes to exit.
Mar 25 01:59:24.344184 systemd-logind[1502]: Removed session 5.
Mar 25 01:59:30.318006 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 25 01:59:30.321378 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:59:30.604108 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:59:30.617393 (kubelet)[1715]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:59:30.688623 kubelet[1715]: E0325 01:59:30.688513 1715 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:59:30.692234 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:59:30.692647 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:59:30.693320 systemd[1]: kubelet.service: Consumed 333ms CPU time, 103.8M memory peak.
Mar 25 01:59:34.491208 systemd[1]: Started sshd@3-10.230.42.214:22-139.178.68.195:58768.service - OpenSSH per-connection server daemon (139.178.68.195:58768).
Mar 25 01:59:35.393265 sshd[1724]: Accepted publickey for core from 139.178.68.195 port 58768 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 01:59:35.395117 sshd-session[1724]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:59:35.403175 systemd-logind[1502]: New session 6 of user core.
Mar 25 01:59:35.409154 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 25 01:59:36.012465 sshd[1726]: Connection closed by 139.178.68.195 port 58768
Mar 25 01:59:36.012224 sshd-session[1724]: pam_unix(sshd:session): session closed for user core
Mar 25 01:59:36.017512 systemd[1]: sshd@3-10.230.42.214:22-139.178.68.195:58768.service: Deactivated successfully.
Mar 25 01:59:36.019784 systemd[1]: session-6.scope: Deactivated successfully.
Mar 25 01:59:36.020702 systemd-logind[1502]: Session 6 logged out. Waiting for processes to exit.
Mar 25 01:59:36.022334 systemd-logind[1502]: Removed session 6.
Mar 25 01:59:36.170223 systemd[1]: Started sshd@4-10.230.42.214:22-139.178.68.195:58774.service - OpenSSH per-connection server daemon (139.178.68.195:58774).
Mar 25 01:59:37.072677 sshd[1732]: Accepted publickey for core from 139.178.68.195 port 58774 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 01:59:37.074574 sshd-session[1732]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:59:37.081801 systemd-logind[1502]: New session 7 of user core.
Mar 25 01:59:37.091107 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 25 01:59:37.687191 sshd[1734]: Connection closed by 139.178.68.195 port 58774
Mar 25 01:59:37.687058 sshd-session[1732]: pam_unix(sshd:session): session closed for user core
Mar 25 01:59:37.690601 systemd-logind[1502]: Session 7 logged out. Waiting for processes to exit.
Mar 25 01:59:37.691229 systemd[1]: sshd@4-10.230.42.214:22-139.178.68.195:58774.service: Deactivated successfully.
Mar 25 01:59:37.693234 systemd[1]: session-7.scope: Deactivated successfully.
Mar 25 01:59:37.695227 systemd-logind[1502]: Removed session 7.
Mar 25 01:59:37.842239 systemd[1]: Started sshd@5-10.230.42.214:22-139.178.68.195:58788.service - OpenSSH per-connection server daemon (139.178.68.195:58788).
Mar 25 01:59:38.746072 sshd[1740]: Accepted publickey for core from 139.178.68.195 port 58788 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 01:59:38.747847 sshd-session[1740]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:59:38.754218 systemd-logind[1502]: New session 8 of user core.
Mar 25 01:59:38.759071 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 25 01:59:39.365239 sshd[1742]: Connection closed by 139.178.68.195 port 58788
Mar 25 01:59:39.366187 sshd-session[1740]: pam_unix(sshd:session): session closed for user core
Mar 25 01:59:39.370681 systemd[1]: sshd@5-10.230.42.214:22-139.178.68.195:58788.service: Deactivated successfully.
Mar 25 01:59:39.372794 systemd[1]: session-8.scope: Deactivated successfully.
Mar 25 01:59:39.374037 systemd-logind[1502]: Session 8 logged out. Waiting for processes to exit.
Mar 25 01:59:39.375201 systemd-logind[1502]: Removed session 8.
Mar 25 01:59:39.523555 systemd[1]: Started sshd@6-10.230.42.214:22-139.178.68.195:58792.service - OpenSSH per-connection server daemon (139.178.68.195:58792).
Mar 25 01:59:40.434832 sshd[1748]: Accepted publickey for core from 139.178.68.195 port 58792 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 01:59:40.436991 sshd-session[1748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:59:40.444225 systemd-logind[1502]: New session 9 of user core.
Mar 25 01:59:40.456244 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 25 01:59:40.701250 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 25 01:59:40.703980 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:59:40.862344 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:59:40.874333 (kubelet)[1759]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:59:40.999506 sudo[1764]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 25 01:59:41.000608 sudo[1764]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:59:41.012715 sudo[1764]: pam_unix(sudo:session): session closed for user root
Mar 25 01:59:41.031622 kubelet[1759]: E0325 01:59:41.031488 1759 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:59:41.034749 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:59:41.035016 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:59:41.035494 systemd[1]: kubelet.service: Consumed 195ms CPU time, 101.6M memory peak.
Mar 25 01:59:41.155918 sshd[1750]: Connection closed by 139.178.68.195 port 58792
Mar 25 01:59:41.156469 sshd-session[1748]: pam_unix(sshd:session): session closed for user core
Mar 25 01:59:41.161020 systemd-logind[1502]: Session 9 logged out. Waiting for processes to exit.
Mar 25 01:59:41.161348 systemd[1]: sshd@6-10.230.42.214:22-139.178.68.195:58792.service: Deactivated successfully.
Mar 25 01:59:41.163514 systemd[1]: session-9.scope: Deactivated successfully.
Mar 25 01:59:41.165484 systemd-logind[1502]: Removed session 9.
Mar 25 01:59:41.314484 systemd[1]: Started sshd@7-10.230.42.214:22-139.178.68.195:58796.service - OpenSSH per-connection server daemon (139.178.68.195:58796).
Mar 25 01:59:42.234009 sshd[1772]: Accepted publickey for core from 139.178.68.195 port 58796 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 01:59:42.236092 sshd-session[1772]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:59:42.244261 systemd-logind[1502]: New session 10 of user core.
Mar 25 01:59:42.254218 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 25 01:59:42.714130 sudo[1776]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 25 01:59:42.715209 sudo[1776]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:59:42.720109 sudo[1776]: pam_unix(sudo:session): session closed for user root
Mar 25 01:59:42.728334 sudo[1775]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 25 01:59:42.728770 sudo[1775]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:59:42.741649 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 25 01:59:42.793012 augenrules[1798]: No rules
Mar 25 01:59:42.794383 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 25 01:59:42.794734 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 25 01:59:42.795832 sudo[1775]: pam_unix(sudo:session): session closed for user root
Mar 25 01:59:42.940693 sshd[1774]: Connection closed by 139.178.68.195 port 58796
Mar 25 01:59:42.940568 sshd-session[1772]: pam_unix(sshd:session): session closed for user core
Mar 25 01:59:42.944979 systemd[1]: sshd@7-10.230.42.214:22-139.178.68.195:58796.service: Deactivated successfully.
Mar 25 01:59:42.947056 systemd[1]: session-10.scope: Deactivated successfully.
Mar 25 01:59:42.948551 systemd-logind[1502]: Session 10 logged out. Waiting for processes to exit.
Mar 25 01:59:42.949941 systemd-logind[1502]: Removed session 10.
Mar 25 01:59:43.094012 systemd[1]: Started sshd@8-10.230.42.214:22-139.178.68.195:58800.service - OpenSSH per-connection server daemon (139.178.68.195:58800).
Mar 25 01:59:44.002166 sshd[1807]: Accepted publickey for core from 139.178.68.195 port 58800 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 01:59:44.004018 sshd-session[1807]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:59:44.012763 systemd-logind[1502]: New session 11 of user core.
Mar 25 01:59:44.023161 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 25 01:59:44.478963 sudo[1810]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 25 01:59:44.479381 sudo[1810]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:59:45.307178 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 25 01:59:45.319624 (dockerd)[1828]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 25 01:59:46.092070 dockerd[1828]: time="2025-03-25T01:59:46.091949626Z" level=info msg="Starting up"
Mar 25 01:59:46.096750 dockerd[1828]: time="2025-03-25T01:59:46.096067783Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 25 01:59:46.183474 dockerd[1828]: time="2025-03-25T01:59:46.183387952Z" level=info msg="Loading containers: start."
Mar 25 01:59:46.381210 kernel: Initializing XFRM netlink socket
Mar 25 01:59:46.385724 systemd-timesyncd[1416]: Network configuration changed, trying to establish connection.
Mar 25 01:59:46.489406 systemd-networkd[1448]: docker0: Link UP
Mar 25 01:59:46.548768 dockerd[1828]: time="2025-03-25T01:59:46.548602672Z" level=info msg="Loading containers: done."
Mar 25 01:59:46.573295 dockerd[1828]: time="2025-03-25T01:59:46.572383041Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 25 01:59:46.573295 dockerd[1828]: time="2025-03-25T01:59:46.572538612Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
Mar 25 01:59:46.573295 dockerd[1828]: time="2025-03-25T01:59:46.572730667Z" level=info msg="Daemon has completed initialization"
Mar 25 01:59:46.614497 dockerd[1828]: time="2025-03-25T01:59:46.614298293Z" level=info msg="API listen on /run/docker.sock"
Mar 25 01:59:46.614581 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 25 01:59:47.607663 systemd-resolved[1395]: Clock change detected. Flushing caches.
Mar 25 01:59:47.608336 systemd-timesyncd[1416]: Contacted time server [2a03:b0c0:1:d0::1f9:f001]:123 (2.flatcar.pool.ntp.org).
Mar 25 01:59:47.608447 systemd-timesyncd[1416]: Initial clock synchronization to Tue 2025-03-25 01:59:47.607254 UTC.
Mar 25 01:59:48.616784 containerd[1523]: time="2025-03-25T01:59:48.616603718Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\""
Mar 25 01:59:49.509388 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1851533508.mount: Deactivated successfully.
Mar 25 01:59:49.610691 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Mar 25 01:59:51.736349 containerd[1523]: time="2025-03-25T01:59:51.736230275Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:51.737928 containerd[1523]: time="2025-03-25T01:59:51.737818102Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.3: active requests=0, bytes read=28682438"
Mar 25 01:59:51.738764 containerd[1523]: time="2025-03-25T01:59:51.738689928Z" level=info msg="ImageCreate event name:\"sha256:f8bdc4cfa0651e2d7edb4678d2b90129aef82a19249b37dc8d4705e8bd604295\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:51.742475 containerd[1523]: time="2025-03-25T01:59:51.742407321Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:51.744263 containerd[1523]: time="2025-03-25T01:59:51.743872917Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.3\" with image id \"sha256:f8bdc4cfa0651e2d7edb4678d2b90129aef82a19249b37dc8d4705e8bd604295\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\", size \"28679230\" in 3.127097354s"
Mar 25 01:59:51.744263 containerd[1523]: time="2025-03-25T01:59:51.743939338Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\" returns image reference \"sha256:f8bdc4cfa0651e2d7edb4678d2b90129aef82a19249b37dc8d4705e8bd604295\""
Mar 25 01:59:51.745311 containerd[1523]: time="2025-03-25T01:59:51.745258777Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\""
Mar 25 01:59:52.117926 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 25 01:59:52.121644 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:59:52.457168 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:59:52.459083 (kubelet)[2096]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:59:52.562186 kubelet[2096]: E0325 01:59:52.562063 2096 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:59:52.566344 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:59:52.567033 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:59:52.567864 systemd[1]: kubelet.service: Consumed 367ms CPU time, 101.8M memory peak.
Mar 25 01:59:54.344337 containerd[1523]: time="2025-03-25T01:59:54.344188045Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:54.346429 containerd[1523]: time="2025-03-25T01:59:54.346259021Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.3: active requests=0, bytes read=24779692"
Mar 25 01:59:54.347460 containerd[1523]: time="2025-03-25T01:59:54.347004780Z" level=info msg="ImageCreate event name:\"sha256:085818208a5213f37ef6d103caaf8e1e243816a614eb5b87a98bfffe79c687b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:54.351295 containerd[1523]: time="2025-03-25T01:59:54.351228454Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:54.353604 containerd[1523]: time="2025-03-25T01:59:54.353030978Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.3\" with image id \"sha256:085818208a5213f37ef6d103caaf8e1e243816a614eb5b87a98bfffe79c687b5\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\", size \"26267292\" in 2.607721703s"
Mar 25 01:59:54.353604 containerd[1523]: time="2025-03-25T01:59:54.353153581Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\" returns image reference \"sha256:085818208a5213f37ef6d103caaf8e1e243816a614eb5b87a98bfffe79c687b5\""
Mar 25 01:59:54.356077 containerd[1523]: time="2025-03-25T01:59:54.356040565Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\""
Mar 25 01:59:56.385635 containerd[1523]: time="2025-03-25T01:59:56.384531068Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:56.386922 containerd[1523]: time="2025-03-25T01:59:56.386829559Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.3: active requests=0, bytes read=19171427"
Mar 25 01:59:56.388103 containerd[1523]: time="2025-03-25T01:59:56.388039029Z" level=info msg="ImageCreate event name:\"sha256:b4260bf5078ab1b01dd05fb05015fc436b7100b7b9b5ea738e247a86008b16b8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:56.391680 containerd[1523]: time="2025-03-25T01:59:56.391599920Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:56.393572 containerd[1523]: time="2025-03-25T01:59:56.393330735Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.3\" with image id \"sha256:b4260bf5078ab1b01dd05fb05015fc436b7100b7b9b5ea738e247a86008b16b8\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\", size \"20659045\" in 2.037241474s"
Mar 25 01:59:56.393572 containerd[1523]: time="2025-03-25T01:59:56.393379090Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\" returns image reference \"sha256:b4260bf5078ab1b01dd05fb05015fc436b7100b7b9b5ea738e247a86008b16b8\""
Mar 25 01:59:56.394624 containerd[1523]: time="2025-03-25T01:59:56.394494644Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\""
Mar 25 01:59:58.232271 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1558169461.mount: Deactivated successfully.
Mar 25 01:59:59.213147 containerd[1523]: time="2025-03-25T01:59:59.213028481Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:59.214658 containerd[1523]: time="2025-03-25T01:59:59.214576393Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.3: active requests=0, bytes read=30918193"
Mar 25 01:59:59.215617 containerd[1523]: time="2025-03-25T01:59:59.215550197Z" level=info msg="ImageCreate event name:\"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:59.218154 containerd[1523]: time="2025-03-25T01:59:59.218066234Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:59:59.219818 containerd[1523]: time="2025-03-25T01:59:59.219276145Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.3\" with image id \"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\", repo tag \"registry.k8s.io/kube-proxy:v1.32.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\", size \"30917204\" in 2.824731457s"
Mar 25 01:59:59.219818 containerd[1523]: time="2025-03-25T01:59:59.219381471Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\" returns image reference \"sha256:a1ae78fd2f9d8fc345928378dc947c7f1e95f01c1a552781827071867a95d09c\""
Mar 25 01:59:59.222428 containerd[1523]: time="2025-03-25T01:59:59.222083001Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Mar 25 01:59:59.854289 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1188407326.mount: Deactivated successfully.
Mar 25 02:00:01.422863 containerd[1523]: time="2025-03-25T02:00:01.422687499Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:01.424876 containerd[1523]: time="2025-03-25T02:00:01.424787342Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249"
Mar 25 02:00:01.425563 containerd[1523]: time="2025-03-25T02:00:01.425178217Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:01.428788 containerd[1523]: time="2025-03-25T02:00:01.428718756Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:01.430413 containerd[1523]: time="2025-03-25T02:00:01.430173980Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.20804796s"
Mar 25 02:00:01.430413 containerd[1523]: time="2025-03-25T02:00:01.430228758Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Mar 25 02:00:01.431684 containerd[1523]: time="2025-03-25T02:00:01.431421927Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 25 02:00:02.073945 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount55167427.mount: Deactivated successfully.
Mar 25 02:00:02.100570 containerd[1523]: time="2025-03-25T02:00:02.099038109Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 25 02:00:02.108046 containerd[1523]: time="2025-03-25T02:00:02.107810914Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146"
Mar 25 02:00:02.128719 containerd[1523]: time="2025-03-25T02:00:02.128638495Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 25 02:00:02.133661 containerd[1523]: time="2025-03-25T02:00:02.133523881Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 25 02:00:02.135266 containerd[1523]: time="2025-03-25T02:00:02.134656471Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 703.183529ms"
Mar 25 02:00:02.135266 containerd[1523]: time="2025-03-25T02:00:02.134794723Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Mar 25 02:00:02.136326 containerd[1523]: time="2025-03-25T02:00:02.136183268Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Mar 25 02:00:02.370223 update_engine[1504]: I20250325 02:00:02.369944 1504 update_attempter.cc:509] Updating boot flags...
Mar 25 02:00:02.425596 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2182)
Mar 25 02:00:02.547861 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2182)
Mar 25 02:00:02.584514 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 25 02:00:02.610827 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:00:03.040396 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4068810044.mount: Deactivated successfully.
Mar 25 02:00:03.055713 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:00:03.071219 (kubelet)[2201]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 02:00:03.187674 kubelet[2201]: E0325 02:00:03.186450 2201 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 02:00:03.191358 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 02:00:03.191650 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 02:00:03.192869 systemd[1]: kubelet.service: Consumed 410ms CPU time, 106M memory peak.
Mar 25 02:00:06.424367 containerd[1523]: time="2025-03-25T02:00:06.422824151Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:06.425858 containerd[1523]: time="2025-03-25T02:00:06.425794799Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551328"
Mar 25 02:00:06.426941 containerd[1523]: time="2025-03-25T02:00:06.426885519Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:06.430531 containerd[1523]: time="2025-03-25T02:00:06.430498073Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:06.432284 containerd[1523]: time="2025-03-25T02:00:06.432063823Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 4.295430845s"
Mar 25 02:00:06.432284 containerd[1523]: time="2025-03-25T02:00:06.432112859Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\""
Mar 25 02:00:11.861932 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:00:11.863005 systemd[1]: kubelet.service: Consumed 410ms CPU time, 106M memory peak.
Mar 25 02:00:11.868585 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:00:11.916266 systemd[1]: Reload requested from client PID 2286 ('systemctl') (unit session-11.scope)...
Mar 25 02:00:11.916745 systemd[1]: Reloading...
Mar 25 02:00:12.161583 zram_generator::config[2332]: No configuration found.
Mar 25 02:00:12.376125 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 02:00:12.527598 systemd[1]: Reloading finished in 610 ms.
Mar 25 02:00:12.614530 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Mar 25 02:00:12.614742 systemd[1]: kubelet.service: Failed with result 'signal'.
Mar 25 02:00:12.615163 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:00:12.615240 systemd[1]: kubelet.service: Consumed 365ms CPU time, 91.7M memory peak.
Mar 25 02:00:12.619809 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:00:12.865412 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:00:12.876172 (kubelet)[2400]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 25 02:00:12.983617 kubelet[2400]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 02:00:12.983617 kubelet[2400]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 25 02:00:12.983617 kubelet[2400]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 02:00:12.983617 kubelet[2400]: I0325 02:00:12.983127 2400 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 25 02:00:14.121735 kubelet[2400]: I0325 02:00:14.121657 2400 server.go:520] "Kubelet version" kubeletVersion="v1.32.0"
Mar 25 02:00:14.121735 kubelet[2400]: I0325 02:00:14.121718 2400 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 25 02:00:14.122490 kubelet[2400]: I0325 02:00:14.122118 2400 server.go:954] "Client rotation is on, will bootstrap in background"
Mar 25 02:00:14.153488 kubelet[2400]: E0325 02:00:14.153390 2400 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.230.42.214:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.230.42.214:6443: connect: connection refused" logger="UnhandledError"
Mar 25 02:00:14.154114 kubelet[2400]: I0325 02:00:14.153858 2400 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 25 02:00:14.176837 kubelet[2400]: I0325 02:00:14.176782 2400 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 25 02:00:14.186351 kubelet[2400]: I0325 02:00:14.185748 2400 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 25 02:00:14.190501 kubelet[2400]: I0325 02:00:14.190396 2400 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 25 02:00:14.190910 kubelet[2400]: I0325 02:00:14.190483 2400 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-u0apo.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 25 02:00:14.192550 kubelet[2400]: I0325 02:00:14.192496 2400 topology_manager.go:138] "Creating topology manager 
with none policy" Mar 25 02:00:14.192550 kubelet[2400]: I0325 02:00:14.192527 2400 container_manager_linux.go:304] "Creating device plugin manager" Mar 25 02:00:14.192890 kubelet[2400]: I0325 02:00:14.192856 2400 state_mem.go:36] "Initialized new in-memory state store" Mar 25 02:00:14.198799 kubelet[2400]: I0325 02:00:14.198768 2400 kubelet.go:446] "Attempting to sync node with API server" Mar 25 02:00:14.198799 kubelet[2400]: I0325 02:00:14.198803 2400 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 02:00:14.198944 kubelet[2400]: I0325 02:00:14.198861 2400 kubelet.go:352] "Adding apiserver pod source" Mar 25 02:00:14.198944 kubelet[2400]: I0325 02:00:14.198894 2400 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 02:00:14.211575 kubelet[2400]: I0325 02:00:14.211276 2400 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 02:00:14.212658 kubelet[2400]: W0325 02:00:14.212278 2400 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.42.214:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-u0apo.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.42.214:6443: connect: connection refused Mar 25 02:00:14.212658 kubelet[2400]: E0325 02:00:14.212408 2400 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.230.42.214:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-u0apo.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.42.214:6443: connect: connection refused" logger="UnhandledError" Mar 25 02:00:14.212658 kubelet[2400]: W0325 02:00:14.212525 2400 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.42.214:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": 
dial tcp 10.230.42.214:6443: connect: connection refused Mar 25 02:00:14.212658 kubelet[2400]: E0325 02:00:14.212593 2400 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.42.214:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.42.214:6443: connect: connection refused" logger="UnhandledError" Mar 25 02:00:14.214852 kubelet[2400]: I0325 02:00:14.214824 2400 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 02:00:14.215673 kubelet[2400]: W0325 02:00:14.215642 2400 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 25 02:00:14.218568 kubelet[2400]: I0325 02:00:14.217005 2400 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 25 02:00:14.218568 kubelet[2400]: I0325 02:00:14.217083 2400 server.go:1287] "Started kubelet" Mar 25 02:00:14.218568 kubelet[2400]: I0325 02:00:14.217825 2400 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 02:00:14.221216 kubelet[2400]: I0325 02:00:14.221192 2400 server.go:490] "Adding debug handlers to kubelet server" Mar 25 02:00:14.221634 kubelet[2400]: I0325 02:00:14.221521 2400 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 02:00:14.222291 kubelet[2400]: I0325 02:00:14.222263 2400 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 02:00:14.226851 kubelet[2400]: E0325 02:00:14.223268 2400 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.230.42.214:6443/api/v1/namespaces/default/events\": dial tcp 10.230.42.214:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-u0apo.gb1.brightbox.com.182fe9398dcc019c default 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-u0apo.gb1.brightbox.com,UID:srv-u0apo.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-u0apo.gb1.brightbox.com,},FirstTimestamp:2025-03-25 02:00:14.217036188 +0000 UTC m=+1.321598100,LastTimestamp:2025-03-25 02:00:14.217036188 +0000 UTC m=+1.321598100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-u0apo.gb1.brightbox.com,}" Mar 25 02:00:14.230462 kubelet[2400]: I0325 02:00:14.230417 2400 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 02:00:14.230735 kubelet[2400]: I0325 02:00:14.230706 2400 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 02:00:14.238395 kubelet[2400]: I0325 02:00:14.238351 2400 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 25 02:00:14.238661 kubelet[2400]: E0325 02:00:14.238624 2400 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"srv-u0apo.gb1.brightbox.com\" not found" Mar 25 02:00:14.241211 kubelet[2400]: I0325 02:00:14.241181 2400 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 02:00:14.241314 kubelet[2400]: I0325 02:00:14.241299 2400 reconciler.go:26] "Reconciler: start to sync state" Mar 25 02:00:14.241870 kubelet[2400]: W0325 02:00:14.241809 2400 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.42.214:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.42.214:6443: connect: connection refused Mar 25 02:00:14.241967 kubelet[2400]: E0325 02:00:14.241884 2400 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.42.214:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.42.214:6443: connect: connection refused" logger="UnhandledError" Mar 25 02:00:14.242028 kubelet[2400]: E0325 02:00:14.241967 2400 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.42.214:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-u0apo.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.42.214:6443: connect: connection refused" interval="200ms" Mar 25 02:00:14.244028 kubelet[2400]: I0325 02:00:14.243695 2400 factory.go:221] Registration of the systemd container factory successfully Mar 25 02:00:14.244028 kubelet[2400]: I0325 02:00:14.243828 2400 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 02:00:14.249505 kubelet[2400]: E0325 02:00:14.249467 2400 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 02:00:14.251838 kubelet[2400]: I0325 02:00:14.251810 2400 factory.go:221] Registration of the containerd container factory successfully Mar 25 02:00:14.283591 kubelet[2400]: I0325 02:00:14.283405 2400 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 02:00:14.285518 kubelet[2400]: I0325 02:00:14.285491 2400 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 02:00:14.285632 kubelet[2400]: I0325 02:00:14.285595 2400 status_manager.go:227] "Starting to sync pod status with apiserver" Mar 25 02:00:14.285690 kubelet[2400]: I0325 02:00:14.285648 2400 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 25 02:00:14.285690 kubelet[2400]: I0325 02:00:14.285670 2400 kubelet.go:2388] "Starting kubelet main sync loop" Mar 25 02:00:14.285879 kubelet[2400]: E0325 02:00:14.285768 2400 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 02:00:14.292960 kubelet[2400]: W0325 02:00:14.292720 2400 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.42.214:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.42.214:6443: connect: connection refused Mar 25 02:00:14.292960 kubelet[2400]: E0325 02:00:14.292796 2400 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.42.214:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.42.214:6443: connect: connection refused" logger="UnhandledError" Mar 25 02:00:14.310569 kubelet[2400]: I0325 02:00:14.309675 2400 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 25 02:00:14.310569 kubelet[2400]: I0325 02:00:14.309705 2400 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 25 02:00:14.310569 kubelet[2400]: I0325 02:00:14.309768 2400 state_mem.go:36] "Initialized new in-memory state store" Mar 25 02:00:14.314566 kubelet[2400]: I0325 02:00:14.314180 2400 policy_none.go:49] "None policy: Start" Mar 25 02:00:14.314566 kubelet[2400]: I0325 02:00:14.314229 2400 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 25 02:00:14.314566 kubelet[2400]: I0325 02:00:14.314259 2400 state_mem.go:35] "Initializing new in-memory state store" Mar 25 02:00:14.325950 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Mar 25 02:00:14.340020 kubelet[2400]: E0325 02:00:14.339897 2400 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"srv-u0apo.gb1.brightbox.com\" not found" Mar 25 02:00:14.340888 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 25 02:00:14.348506 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 25 02:00:14.357566 kubelet[2400]: I0325 02:00:14.357165 2400 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 02:00:14.357780 kubelet[2400]: I0325 02:00:14.357753 2400 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 25 02:00:14.357992 kubelet[2400]: I0325 02:00:14.357898 2400 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 02:00:14.360164 kubelet[2400]: I0325 02:00:14.360032 2400 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 02:00:14.363289 kubelet[2400]: E0325 02:00:14.363265 2400 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 25 02:00:14.363645 kubelet[2400]: E0325 02:00:14.363622 2400 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-u0apo.gb1.brightbox.com\" not found" Mar 25 02:00:14.402913 systemd[1]: Created slice kubepods-burstable-pod971b9e87a2f03011a48bf7d6252c33a1.slice - libcontainer container kubepods-burstable-pod971b9e87a2f03011a48bf7d6252c33a1.slice. 
Mar 25 02:00:14.415583 kubelet[2400]: E0325 02:00:14.413846 2400 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-u0apo.gb1.brightbox.com\" not found" node="srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.418932 systemd[1]: Created slice kubepods-burstable-pod60a410da398007b5894be16394648cfc.slice - libcontainer container kubepods-burstable-pod60a410da398007b5894be16394648cfc.slice. Mar 25 02:00:14.421970 kubelet[2400]: E0325 02:00:14.421745 2400 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-u0apo.gb1.brightbox.com\" not found" node="srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.433980 systemd[1]: Created slice kubepods-burstable-pod43d827cf4e04a31afd9da178160e6e5b.slice - libcontainer container kubepods-burstable-pod43d827cf4e04a31afd9da178160e6e5b.slice. Mar 25 02:00:14.436750 kubelet[2400]: E0325 02:00:14.436691 2400 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-u0apo.gb1.brightbox.com\" not found" node="srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.442814 kubelet[2400]: I0325 02:00:14.442771 2400 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/971b9e87a2f03011a48bf7d6252c33a1-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" (UID: \"971b9e87a2f03011a48bf7d6252c33a1\") " pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.442956 kubelet[2400]: I0325 02:00:14.442830 2400 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/60a410da398007b5894be16394648cfc-ca-certs\") pod \"kube-apiserver-srv-u0apo.gb1.brightbox.com\" (UID: \"60a410da398007b5894be16394648cfc\") " 
pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.442956 kubelet[2400]: I0325 02:00:14.442862 2400 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/60a410da398007b5894be16394648cfc-usr-share-ca-certificates\") pod \"kube-apiserver-srv-u0apo.gb1.brightbox.com\" (UID: \"60a410da398007b5894be16394648cfc\") " pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.442956 kubelet[2400]: I0325 02:00:14.442893 2400 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/971b9e87a2f03011a48bf7d6252c33a1-k8s-certs\") pod \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" (UID: \"971b9e87a2f03011a48bf7d6252c33a1\") " pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.442956 kubelet[2400]: I0325 02:00:14.442923 2400 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/971b9e87a2f03011a48bf7d6252c33a1-kubeconfig\") pod \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" (UID: \"971b9e87a2f03011a48bf7d6252c33a1\") " pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.442956 kubelet[2400]: I0325 02:00:14.442950 2400 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/43d827cf4e04a31afd9da178160e6e5b-kubeconfig\") pod \"kube-scheduler-srv-u0apo.gb1.brightbox.com\" (UID: \"43d827cf4e04a31afd9da178160e6e5b\") " pod="kube-system/kube-scheduler-srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.443500 kubelet[2400]: I0325 02:00:14.442975 2400 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/60a410da398007b5894be16394648cfc-k8s-certs\") pod \"kube-apiserver-srv-u0apo.gb1.brightbox.com\" (UID: \"60a410da398007b5894be16394648cfc\") " pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.443500 kubelet[2400]: I0325 02:00:14.443002 2400 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/971b9e87a2f03011a48bf7d6252c33a1-ca-certs\") pod \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" (UID: \"971b9e87a2f03011a48bf7d6252c33a1\") " pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.443500 kubelet[2400]: I0325 02:00:14.443039 2400 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/971b9e87a2f03011a48bf7d6252c33a1-flexvolume-dir\") pod \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" (UID: \"971b9e87a2f03011a48bf7d6252c33a1\") " pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.443500 kubelet[2400]: E0325 02:00:14.443313 2400 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.42.214:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-u0apo.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.42.214:6443: connect: connection refused" interval="400ms" Mar 25 02:00:14.467488 kubelet[2400]: I0325 02:00:14.467402 2400 kubelet_node_status.go:76] "Attempting to register node" node="srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.467990 kubelet[2400]: E0325 02:00:14.467931 2400 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.230.42.214:6443/api/v1/nodes\": dial tcp 10.230.42.214:6443: connect: connection refused" node="srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.671985 kubelet[2400]: I0325 02:00:14.671805 2400 kubelet_node_status.go:76] "Attempting 
to register node" node="srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.672471 kubelet[2400]: E0325 02:00:14.672362 2400 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.230.42.214:6443/api/v1/nodes\": dial tcp 10.230.42.214:6443: connect: connection refused" node="srv-u0apo.gb1.brightbox.com" Mar 25 02:00:14.718907 containerd[1523]: time="2025-03-25T02:00:14.718465869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-u0apo.gb1.brightbox.com,Uid:971b9e87a2f03011a48bf7d6252c33a1,Namespace:kube-system,Attempt:0,}" Mar 25 02:00:14.724452 containerd[1523]: time="2025-03-25T02:00:14.723929657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-u0apo.gb1.brightbox.com,Uid:60a410da398007b5894be16394648cfc,Namespace:kube-system,Attempt:0,}" Mar 25 02:00:14.738174 containerd[1523]: time="2025-03-25T02:00:14.738124956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-u0apo.gb1.brightbox.com,Uid:43d827cf4e04a31afd9da178160e6e5b,Namespace:kube-system,Attempt:0,}" Mar 25 02:00:14.844352 kubelet[2400]: E0325 02:00:14.843991 2400 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.42.214:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-u0apo.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.42.214:6443: connect: connection refused" interval="800ms" Mar 25 02:00:15.019790 containerd[1523]: time="2025-03-25T02:00:15.019390874Z" level=info msg="connecting to shim 43dff33dc7f4546a012db38c4a147da85858edbed699e06d90b266dfa93a866f" address="unix:///run/containerd/s/b75be1b447888481e5cd381d3b2f1f5facbe96c85fbbe0a2b4152be0fd64e53a" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:00:15.024574 containerd[1523]: time="2025-03-25T02:00:15.023678426Z" level=info msg="connecting to shim f6186400966202f7f15e8e0700e8686d0ddbcad3f3dcd348590b6cea4dcbac6a" 
address="unix:///run/containerd/s/07500eb00dc9c1cb279c4682dad4d19c4a772c0cc8de49417fcdbe359cc4b826" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:00:15.026075 containerd[1523]: time="2025-03-25T02:00:15.026030245Z" level=info msg="connecting to shim a7ff01ca1eff6c91dc6dfa0e75359087ca3d3960aef4f3e526b13bc20076ecac" address="unix:///run/containerd/s/f759bb9f7a2d901414b2f589f8fe4d4c642c837cd82811ce706ac23bd948e069" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:00:15.081266 kubelet[2400]: I0325 02:00:15.080707 2400 kubelet_node_status.go:76] "Attempting to register node" node="srv-u0apo.gb1.brightbox.com" Mar 25 02:00:15.081514 kubelet[2400]: E0325 02:00:15.081477 2400 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.230.42.214:6443/api/v1/nodes\": dial tcp 10.230.42.214:6443: connect: connection refused" node="srv-u0apo.gb1.brightbox.com" Mar 25 02:00:15.172885 systemd[1]: Started cri-containerd-43dff33dc7f4546a012db38c4a147da85858edbed699e06d90b266dfa93a866f.scope - libcontainer container 43dff33dc7f4546a012db38c4a147da85858edbed699e06d90b266dfa93a866f. Mar 25 02:00:15.176874 systemd[1]: Started cri-containerd-a7ff01ca1eff6c91dc6dfa0e75359087ca3d3960aef4f3e526b13bc20076ecac.scope - libcontainer container a7ff01ca1eff6c91dc6dfa0e75359087ca3d3960aef4f3e526b13bc20076ecac. Mar 25 02:00:15.181109 systemd[1]: Started cri-containerd-f6186400966202f7f15e8e0700e8686d0ddbcad3f3dcd348590b6cea4dcbac6a.scope - libcontainer container f6186400966202f7f15e8e0700e8686d0ddbcad3f3dcd348590b6cea4dcbac6a. 
Mar 25 02:00:15.321382 containerd[1523]: time="2025-03-25T02:00:15.320874809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-u0apo.gb1.brightbox.com,Uid:43d827cf4e04a31afd9da178160e6e5b,Namespace:kube-system,Attempt:0,} returns sandbox id \"f6186400966202f7f15e8e0700e8686d0ddbcad3f3dcd348590b6cea4dcbac6a\"" Mar 25 02:00:15.334571 containerd[1523]: time="2025-03-25T02:00:15.333266241Z" level=info msg="CreateContainer within sandbox \"f6186400966202f7f15e8e0700e8686d0ddbcad3f3dcd348590b6cea4dcbac6a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 25 02:00:15.337145 containerd[1523]: time="2025-03-25T02:00:15.337063105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-u0apo.gb1.brightbox.com,Uid:60a410da398007b5894be16394648cfc,Namespace:kube-system,Attempt:0,} returns sandbox id \"a7ff01ca1eff6c91dc6dfa0e75359087ca3d3960aef4f3e526b13bc20076ecac\"" Mar 25 02:00:15.343701 containerd[1523]: time="2025-03-25T02:00:15.343659590Z" level=info msg="CreateContainer within sandbox \"a7ff01ca1eff6c91dc6dfa0e75359087ca3d3960aef4f3e526b13bc20076ecac\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 25 02:00:15.350936 kubelet[2400]: W0325 02:00:15.350870 2400 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.42.214:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.42.214:6443: connect: connection refused Mar 25 02:00:15.351633 kubelet[2400]: E0325 02:00:15.350982 2400 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.42.214:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.42.214:6443: connect: connection refused" logger="UnhandledError" Mar 25 02:00:15.364921 containerd[1523]: 
time="2025-03-25T02:00:15.364244807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-u0apo.gb1.brightbox.com,Uid:971b9e87a2f03011a48bf7d6252c33a1,Namespace:kube-system,Attempt:0,} returns sandbox id \"43dff33dc7f4546a012db38c4a147da85858edbed699e06d90b266dfa93a866f\"" Mar 25 02:00:15.367706 containerd[1523]: time="2025-03-25T02:00:15.367648711Z" level=info msg="Container b45c51c8387ef3a53678ea9673927b6f53d3081d0d1cce74f16f6140a3dee55f: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:00:15.369864 containerd[1523]: time="2025-03-25T02:00:15.369809583Z" level=info msg="Container 50a7aefe42c94f30ce00acb5e9493c61d50716b65c8f7336d7adc74fdb1705f4: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:00:15.372576 containerd[1523]: time="2025-03-25T02:00:15.372101935Z" level=info msg="CreateContainer within sandbox \"43dff33dc7f4546a012db38c4a147da85858edbed699e06d90b266dfa93a866f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 25 02:00:15.383171 containerd[1523]: time="2025-03-25T02:00:15.382919707Z" level=info msg="CreateContainer within sandbox \"f6186400966202f7f15e8e0700e8686d0ddbcad3f3dcd348590b6cea4dcbac6a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"b45c51c8387ef3a53678ea9673927b6f53d3081d0d1cce74f16f6140a3dee55f\"" Mar 25 02:00:15.384760 containerd[1523]: time="2025-03-25T02:00:15.384713889Z" level=info msg="StartContainer for \"b45c51c8387ef3a53678ea9673927b6f53d3081d0d1cce74f16f6140a3dee55f\"" Mar 25 02:00:15.390243 containerd[1523]: time="2025-03-25T02:00:15.390160650Z" level=info msg="CreateContainer within sandbox \"a7ff01ca1eff6c91dc6dfa0e75359087ca3d3960aef4f3e526b13bc20076ecac\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"50a7aefe42c94f30ce00acb5e9493c61d50716b65c8f7336d7adc74fdb1705f4\"" Mar 25 02:00:15.391392 containerd[1523]: time="2025-03-25T02:00:15.391359161Z" level=info msg="StartContainer for 
\"50a7aefe42c94f30ce00acb5e9493c61d50716b65c8f7336d7adc74fdb1705f4\"" Mar 25 02:00:15.393110 containerd[1523]: time="2025-03-25T02:00:15.392331415Z" level=info msg="connecting to shim b45c51c8387ef3a53678ea9673927b6f53d3081d0d1cce74f16f6140a3dee55f" address="unix:///run/containerd/s/07500eb00dc9c1cb279c4682dad4d19c4a772c0cc8de49417fcdbe359cc4b826" protocol=ttrpc version=3 Mar 25 02:00:15.393187 containerd[1523]: time="2025-03-25T02:00:15.393150473Z" level=info msg="connecting to shim 50a7aefe42c94f30ce00acb5e9493c61d50716b65c8f7336d7adc74fdb1705f4" address="unix:///run/containerd/s/f759bb9f7a2d901414b2f589f8fe4d4c642c837cd82811ce706ac23bd948e069" protocol=ttrpc version=3 Mar 25 02:00:15.395912 containerd[1523]: time="2025-03-25T02:00:15.395882967Z" level=info msg="Container 72ff152e2c46074100afafd2e7d00b3d314d34044f34708794fbdac21be9a5da: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:00:15.406762 containerd[1523]: time="2025-03-25T02:00:15.406592288Z" level=info msg="CreateContainer within sandbox \"43dff33dc7f4546a012db38c4a147da85858edbed699e06d90b266dfa93a866f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"72ff152e2c46074100afafd2e7d00b3d314d34044f34708794fbdac21be9a5da\"" Mar 25 02:00:15.408997 containerd[1523]: time="2025-03-25T02:00:15.408952641Z" level=info msg="StartContainer for \"72ff152e2c46074100afafd2e7d00b3d314d34044f34708794fbdac21be9a5da\"" Mar 25 02:00:15.409366 kubelet[2400]: W0325 02:00:15.408722 2400 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.42.214:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-u0apo.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.42.214:6443: connect: connection refused Mar 25 02:00:15.409506 kubelet[2400]: E0325 02:00:15.409407 2400 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://10.230.42.214:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-u0apo.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.42.214:6443: connect: connection refused" logger="UnhandledError" Mar 25 02:00:15.414306 containerd[1523]: time="2025-03-25T02:00:15.414260210Z" level=info msg="connecting to shim 72ff152e2c46074100afafd2e7d00b3d314d34044f34708794fbdac21be9a5da" address="unix:///run/containerd/s/b75be1b447888481e5cd381d3b2f1f5facbe96c85fbbe0a2b4152be0fd64e53a" protocol=ttrpc version=3 Mar 25 02:00:15.433758 systemd[1]: Started cri-containerd-50a7aefe42c94f30ce00acb5e9493c61d50716b65c8f7336d7adc74fdb1705f4.scope - libcontainer container 50a7aefe42c94f30ce00acb5e9493c61d50716b65c8f7336d7adc74fdb1705f4. Mar 25 02:00:15.458734 systemd[1]: Started cri-containerd-b45c51c8387ef3a53678ea9673927b6f53d3081d0d1cce74f16f6140a3dee55f.scope - libcontainer container b45c51c8387ef3a53678ea9673927b6f53d3081d0d1cce74f16f6140a3dee55f. Mar 25 02:00:15.464445 systemd[1]: Started cri-containerd-72ff152e2c46074100afafd2e7d00b3d314d34044f34708794fbdac21be9a5da.scope - libcontainer container 72ff152e2c46074100afafd2e7d00b3d314d34044f34708794fbdac21be9a5da. 
Mar 25 02:00:15.571677 containerd[1523]: time="2025-03-25T02:00:15.570298597Z" level=info msg="StartContainer for \"50a7aefe42c94f30ce00acb5e9493c61d50716b65c8f7336d7adc74fdb1705f4\" returns successfully"
Mar 25 02:00:15.619858 kubelet[2400]: W0325 02:00:15.619775 2400 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.42.214:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.42.214:6443: connect: connection refused
Mar 25 02:00:15.620061 kubelet[2400]: E0325 02:00:15.619871 2400 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.42.214:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.42.214:6443: connect: connection refused" logger="UnhandledError"
Mar 25 02:00:15.623363 containerd[1523]: time="2025-03-25T02:00:15.623049571Z" level=info msg="StartContainer for \"b45c51c8387ef3a53678ea9673927b6f53d3081d0d1cce74f16f6140a3dee55f\" returns successfully"
Mar 25 02:00:15.625496 containerd[1523]: time="2025-03-25T02:00:15.625392362Z" level=info msg="StartContainer for \"72ff152e2c46074100afafd2e7d00b3d314d34044f34708794fbdac21be9a5da\" returns successfully"
Mar 25 02:00:15.645358 kubelet[2400]: E0325 02:00:15.645300 2400 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.42.214:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-u0apo.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.42.214:6443: connect: connection refused" interval="1.6s"
Mar 25 02:00:15.654428 kubelet[2400]: W0325 02:00:15.654331 2400 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.42.214:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.42.214:6443: connect: connection refused
Mar 25 02:00:15.654652 kubelet[2400]: E0325 02:00:15.654438 2400 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.42.214:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.42.214:6443: connect: connection refused" logger="UnhandledError"
Mar 25 02:00:15.765717 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1747988231.mount: Deactivated successfully.
Mar 25 02:00:15.884934 kubelet[2400]: I0325 02:00:15.884889 2400 kubelet_node_status.go:76] "Attempting to register node" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:15.885442 kubelet[2400]: E0325 02:00:15.885396 2400 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.230.42.214:6443/api/v1/nodes\": dial tcp 10.230.42.214:6443: connect: connection refused" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:16.321863 kubelet[2400]: E0325 02:00:16.320377 2400 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-u0apo.gb1.brightbox.com\" not found" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:16.333662 kubelet[2400]: E0325 02:00:16.333598 2400 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-u0apo.gb1.brightbox.com\" not found" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:16.339674 kubelet[2400]: E0325 02:00:16.339638 2400 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-u0apo.gb1.brightbox.com\" not found" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:17.349613 kubelet[2400]: E0325 02:00:17.349160 2400 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-u0apo.gb1.brightbox.com\" not found" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:17.352094 kubelet[2400]: E0325 02:00:17.349534 2400 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-u0apo.gb1.brightbox.com\" not found" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:17.352094 kubelet[2400]: E0325 02:00:17.349723 2400 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"srv-u0apo.gb1.brightbox.com\" not found" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:17.490582 kubelet[2400]: I0325 02:00:17.490467 2400 kubelet_node_status.go:76] "Attempting to register node" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:18.984910 kubelet[2400]: E0325 02:00:18.984788 2400 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-u0apo.gb1.brightbox.com\" not found" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:19.009776 kubelet[2400]: I0325 02:00:19.009644 2400 kubelet_node_status.go:79] "Successfully registered node" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:19.009776 kubelet[2400]: E0325 02:00:19.009739 2400 kubelet_node_status.go:549] "Error updating node status, will retry" err="error getting node \"srv-u0apo.gb1.brightbox.com\": node \"srv-u0apo.gb1.brightbox.com\" not found"
Mar 25 02:00:19.039883 kubelet[2400]: I0325 02:00:19.039840 2400 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:19.090290 kubelet[2400]: E0325 02:00:19.090201 2400 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-u0apo.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:19.090290 kubelet[2400]: I0325 02:00:19.090246 2400 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:19.095197 kubelet[2400]: E0325 02:00:19.095157 2400 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:19.095338 kubelet[2400]: I0325 02:00:19.095195 2400 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:19.098358 kubelet[2400]: E0325 02:00:19.098299 2400 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-u0apo.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:19.205067 kubelet[2400]: I0325 02:00:19.205013 2400 apiserver.go:52] "Watching apiserver"
Mar 25 02:00:19.241740 kubelet[2400]: I0325 02:00:19.241516 2400 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Mar 25 02:00:20.418318 kubelet[2400]: I0325 02:00:20.418248 2400 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:20.427215 kubelet[2400]: W0325 02:00:20.425591 2400 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:00:21.420360 systemd[1]: Reload requested from client PID 2672 ('systemctl') (unit session-11.scope)...
Mar 25 02:00:21.421301 systemd[1]: Reloading...
Mar 25 02:00:21.569642 zram_generator::config[2721]: No configuration found.
Mar 25 02:00:21.777671 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 02:00:21.981116 systemd[1]: Reloading finished in 558 ms.
Mar 25 02:00:22.028471 kubelet[2400]: I0325 02:00:22.027795 2400 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 25 02:00:22.028161 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:00:22.044786 systemd[1]: kubelet.service: Deactivated successfully.
Mar 25 02:00:22.045402 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:00:22.045582 systemd[1]: kubelet.service: Consumed 2.030s CPU time, 119.9M memory peak.
Mar 25 02:00:22.053901 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 02:00:22.607232 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 02:00:22.620051 (kubelet)[2782]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 25 02:00:22.760647 kubelet[2782]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 02:00:22.760647 kubelet[2782]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 25 02:00:22.760647 kubelet[2782]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 02:00:22.765262 kubelet[2782]: I0325 02:00:22.764602 2782 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 25 02:00:22.782848 kubelet[2782]: I0325 02:00:22.782788 2782 server.go:520] "Kubelet version" kubeletVersion="v1.32.0"
Mar 25 02:00:22.783671 kubelet[2782]: I0325 02:00:22.783079 2782 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 25 02:00:22.785211 kubelet[2782]: I0325 02:00:22.785188 2782 server.go:954] "Client rotation is on, will bootstrap in background"
Mar 25 02:00:22.789744 kubelet[2782]: I0325 02:00:22.789714 2782 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 25 02:00:22.797490 kubelet[2782]: I0325 02:00:22.797451 2782 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 25 02:00:22.819703 kubelet[2782]: I0325 02:00:22.819634 2782 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 25 02:00:22.829768 kubelet[2782]: I0325 02:00:22.829711 2782 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 25 02:00:22.831211 kubelet[2782]: I0325 02:00:22.830520 2782 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 25 02:00:22.831211 kubelet[2782]: I0325 02:00:22.830601 2782 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-u0apo.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 25 02:00:22.831773 kubelet[2782]: I0325 02:00:22.831749 2782 topology_manager.go:138] "Creating topology manager with none policy"
Mar 25 02:00:22.831871 kubelet[2782]: I0325 02:00:22.831855 2782 container_manager_linux.go:304] "Creating device plugin manager"
Mar 25 02:00:22.832258 kubelet[2782]: I0325 02:00:22.832237 2782 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 02:00:22.833806 kubelet[2782]: I0325 02:00:22.833783 2782 kubelet.go:446] "Attempting to sync node with API server"
Mar 25 02:00:22.834099 kubelet[2782]: I0325 02:00:22.833931 2782 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 25 02:00:22.837743 kubelet[2782]: I0325 02:00:22.837720 2782 kubelet.go:352] "Adding apiserver pod source"
Mar 25 02:00:22.837950 kubelet[2782]: I0325 02:00:22.837927 2782 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 25 02:00:22.848029 kubelet[2782]: I0325 02:00:22.847992 2782 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 25 02:00:22.849408 kubelet[2782]: I0325 02:00:22.849208 2782 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 25 02:00:22.857406 kubelet[2782]: I0325 02:00:22.857360 2782 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Mar 25 02:00:22.860061 kubelet[2782]: I0325 02:00:22.858000 2782 server.go:1287] "Started kubelet"
Mar 25 02:00:22.863309 kubelet[2782]: I0325 02:00:22.863198 2782 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Mar 25 02:00:22.870566 kubelet[2782]: I0325 02:00:22.867679 2782 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 25 02:00:22.870830 kubelet[2782]: I0325 02:00:22.870806 2782 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 25 02:00:22.871200 kubelet[2782]: I0325 02:00:22.871169 2782 server.go:490] "Adding debug handlers to kubelet server"
Mar 25 02:00:22.882315 kubelet[2782]: I0325 02:00:22.882271 2782 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 25 02:00:22.893359 kubelet[2782]: I0325 02:00:22.892657 2782 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 25 02:00:22.902572 kubelet[2782]: I0325 02:00:22.901333 2782 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 25 02:00:22.922333 kubelet[2782]: E0325 02:00:22.922085 2782 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 25 02:00:22.924480 kubelet[2782]: I0325 02:00:22.924170 2782 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Mar 25 02:00:22.935907 kubelet[2782]: I0325 02:00:22.934232 2782 reconciler.go:26] "Reconciler: start to sync state"
Mar 25 02:00:22.938255 kubelet[2782]: I0325 02:00:22.936288 2782 factory.go:221] Registration of the systemd container factory successfully
Mar 25 02:00:22.946867 kubelet[2782]: I0325 02:00:22.946083 2782 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 25 02:00:22.955434 kubelet[2782]: I0325 02:00:22.954793 2782 factory.go:221] Registration of the containerd container factory successfully
Mar 25 02:00:22.958976 kubelet[2782]: I0325 02:00:22.958777 2782 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 25 02:00:22.970366 kubelet[2782]: I0325 02:00:22.970233 2782 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 25 02:00:22.972600 kubelet[2782]: I0325 02:00:22.971998 2782 status_manager.go:227] "Starting to sync pod status with apiserver"
Mar 25 02:00:22.972600 kubelet[2782]: I0325 02:00:22.972069 2782 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 25 02:00:22.972600 kubelet[2782]: I0325 02:00:22.972096 2782 kubelet.go:2388] "Starting kubelet main sync loop"
Mar 25 02:00:22.972600 kubelet[2782]: E0325 02:00:22.972213 2782 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 25 02:00:23.072389 kubelet[2782]: E0325 02:00:23.072335 2782 kubelet.go:2412] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 25 02:00:23.086097 kubelet[2782]: I0325 02:00:23.086053 2782 cpu_manager.go:221] "Starting CPU manager" policy="none"
Mar 25 02:00:23.086471 kubelet[2782]: I0325 02:00:23.086396 2782 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Mar 25 02:00:23.086730 kubelet[2782]: I0325 02:00:23.086711 2782 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 02:00:23.087590 kubelet[2782]: I0325 02:00:23.087457 2782 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 25 02:00:23.087831 kubelet[2782]: I0325 02:00:23.087724 2782 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 25 02:00:23.088024 kubelet[2782]: I0325 02:00:23.088006 2782 policy_none.go:49] "None policy: Start"
Mar 25 02:00:23.088273 kubelet[2782]: I0325 02:00:23.088244 2782 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 25 02:00:23.088467 kubelet[2782]: I0325 02:00:23.088439 2782 state_mem.go:35] "Initializing new in-memory state store"
Mar 25 02:00:23.088898 kubelet[2782]: I0325 02:00:23.088877 2782 state_mem.go:75] "Updated machine memory state"
Mar 25 02:00:23.107868 kubelet[2782]: I0325 02:00:23.106680 2782 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 25 02:00:23.107868 kubelet[2782]: I0325 02:00:23.107116 2782 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 25 02:00:23.107868 kubelet[2782]: I0325 02:00:23.107182 2782 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 25 02:00:23.110280 kubelet[2782]: I0325 02:00:23.109369 2782 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 25 02:00:23.121281 kubelet[2782]: E0325 02:00:23.121194 2782 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 25 02:00:23.284907 kubelet[2782]: I0325 02:00:23.282740 2782 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.289777 kubelet[2782]: I0325 02:00:23.288751 2782 kubelet_node_status.go:76] "Attempting to register node" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.292985 kubelet[2782]: I0325 02:00:23.292376 2782 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.295062 kubelet[2782]: I0325 02:00:23.294760 2782 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.319174 kubelet[2782]: W0325 02:00:23.314244 2782 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:00:23.338740 kubelet[2782]: I0325 02:00:23.338603 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/971b9e87a2f03011a48bf7d6252c33a1-ca-certs\") pod \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" (UID: \"971b9e87a2f03011a48bf7d6252c33a1\") " pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.339222 kubelet[2782]: I0325 02:00:23.338976 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/971b9e87a2f03011a48bf7d6252c33a1-kubeconfig\") pod \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" (UID: \"971b9e87a2f03011a48bf7d6252c33a1\") " pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.339727 kubelet[2782]: I0325 02:00:23.339455 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/971b9e87a2f03011a48bf7d6252c33a1-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" (UID: \"971b9e87a2f03011a48bf7d6252c33a1\") " pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.339727 kubelet[2782]: I0325 02:00:23.339594 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/43d827cf4e04a31afd9da178160e6e5b-kubeconfig\") pod \"kube-scheduler-srv-u0apo.gb1.brightbox.com\" (UID: \"43d827cf4e04a31afd9da178160e6e5b\") " pod="kube-system/kube-scheduler-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.339727 kubelet[2782]: I0325 02:00:23.339628 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/60a410da398007b5894be16394648cfc-ca-certs\") pod \"kube-apiserver-srv-u0apo.gb1.brightbox.com\" (UID: \"60a410da398007b5894be16394648cfc\") " pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.341594 kubelet[2782]: I0325 02:00:23.339989 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/971b9e87a2f03011a48bf7d6252c33a1-flexvolume-dir\") pod \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" (UID: \"971b9e87a2f03011a48bf7d6252c33a1\") " pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.341594 kubelet[2782]: I0325 02:00:23.340090 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/971b9e87a2f03011a48bf7d6252c33a1-k8s-certs\") pod \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" (UID: \"971b9e87a2f03011a48bf7d6252c33a1\") " pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.341594 kubelet[2782]: I0325 02:00:23.340181 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/60a410da398007b5894be16394648cfc-k8s-certs\") pod \"kube-apiserver-srv-u0apo.gb1.brightbox.com\" (UID: \"60a410da398007b5894be16394648cfc\") " pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.342062 kubelet[2782]: I0325 02:00:23.341731 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/60a410da398007b5894be16394648cfc-usr-share-ca-certificates\") pod \"kube-apiserver-srv-u0apo.gb1.brightbox.com\" (UID: \"60a410da398007b5894be16394648cfc\") " pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.369572 kubelet[2782]: W0325 02:00:23.367527 2782 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:00:23.370218 kubelet[2782]: W0325 02:00:23.370193 2782 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:00:23.370903 kubelet[2782]: E0325 02:00:23.370867 2782 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-srv-u0apo.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.379940 kubelet[2782]: I0325 02:00:23.379879 2782 kubelet_node_status.go:125] "Node was previously registered" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.381263 kubelet[2782]: I0325 02:00:23.380624 2782 kubelet_node_status.go:79] "Successfully registered node" node="srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:23.839251 kubelet[2782]: I0325 02:00:23.839054 2782 apiserver.go:52] "Watching apiserver"
Mar 25 02:00:23.924740 kubelet[2782]: I0325 02:00:23.924675 2782 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Mar 25 02:00:24.029584 kubelet[2782]: I0325 02:00:24.026827 2782 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:24.029584 kubelet[2782]: I0325 02:00:24.029185 2782 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:24.056525 kubelet[2782]: W0325 02:00:24.056463 2782 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:00:24.056812 kubelet[2782]: E0325 02:00:24.056602 2782 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-srv-u0apo.gb1.brightbox.com\" already exists" pod="kube-system/kube-scheduler-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:24.083285 kubelet[2782]: W0325 02:00:24.083224 2782 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Mar 25 02:00:24.083524 kubelet[2782]: E0325 02:00:24.083336 2782 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-srv-u0apo.gb1.brightbox.com\" already exists" pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com"
Mar 25 02:00:24.328361 kubelet[2782]: I0325 02:00:24.328196 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-u0apo.gb1.brightbox.com" podStartSLOduration=1.3280989810000001 podStartE2EDuration="1.328098981s" podCreationTimestamp="2025-03-25 02:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:00:24.219593032 +0000 UTC m=+1.584560289" watchObservedRunningTime="2025-03-25 02:00:24.328098981 +0000 UTC m=+1.693066232"
Mar 25 02:00:24.402163 kubelet[2782]: I0325 02:00:24.400756 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-u0apo.gb1.brightbox.com" podStartSLOduration=1.400694339 podStartE2EDuration="1.400694339s" podCreationTimestamp="2025-03-25 02:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:00:24.338252394 +0000 UTC m=+1.703219649" watchObservedRunningTime="2025-03-25 02:00:24.400694339 +0000 UTC m=+1.765661597"
Mar 25 02:00:24.473807 kubelet[2782]: I0325 02:00:24.473025 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-u0apo.gb1.brightbox.com" podStartSLOduration=4.473000757 podStartE2EDuration="4.473000757s" podCreationTimestamp="2025-03-25 02:00:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:00:24.404892996 +0000 UTC m=+1.769860260" watchObservedRunningTime="2025-03-25 02:00:24.473000757 +0000 UTC m=+1.837968021"
Mar 25 02:00:25.868938 kubelet[2782]: I0325 02:00:25.868316 2782 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 25 02:00:25.873258 containerd[1523]: time="2025-03-25T02:00:25.871099966Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 25 02:00:25.876162 kubelet[2782]: I0325 02:00:25.871632 2782 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 25 02:00:26.033072 systemd[1]: Created slice kubepods-besteffort-pod8d47bb44_d19e_4867_86b0_3320e3795239.slice - libcontainer container kubepods-besteffort-pod8d47bb44_d19e_4867_86b0_3320e3795239.slice.
Mar 25 02:00:26.059799 kubelet[2782]: I0325 02:00:26.059512 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8d47bb44-d19e-4867-86b0-3320e3795239-kube-proxy\") pod \"kube-proxy-8lkwr\" (UID: \"8d47bb44-d19e-4867-86b0-3320e3795239\") " pod="kube-system/kube-proxy-8lkwr"
Mar 25 02:00:26.059799 kubelet[2782]: I0325 02:00:26.059615 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d47bb44-d19e-4867-86b0-3320e3795239-lib-modules\") pod \"kube-proxy-8lkwr\" (UID: \"8d47bb44-d19e-4867-86b0-3320e3795239\") " pod="kube-system/kube-proxy-8lkwr"
Mar 25 02:00:26.059799 kubelet[2782]: I0325 02:00:26.059649 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8vk\" (UniqueName: \"kubernetes.io/projected/8d47bb44-d19e-4867-86b0-3320e3795239-kube-api-access-4k8vk\") pod \"kube-proxy-8lkwr\" (UID: \"8d47bb44-d19e-4867-86b0-3320e3795239\") " pod="kube-system/kube-proxy-8lkwr"
Mar 25 02:00:26.059799 kubelet[2782]: I0325 02:00:26.059728 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8d47bb44-d19e-4867-86b0-3320e3795239-xtables-lock\") pod \"kube-proxy-8lkwr\" (UID: \"8d47bb44-d19e-4867-86b0-3320e3795239\") " pod="kube-system/kube-proxy-8lkwr"
Mar 25 02:00:26.349060 containerd[1523]: time="2025-03-25T02:00:26.348830918Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8lkwr,Uid:8d47bb44-d19e-4867-86b0-3320e3795239,Namespace:kube-system,Attempt:0,}"
Mar 25 02:00:26.389887 containerd[1523]: time="2025-03-25T02:00:26.388684832Z" level=info msg="connecting to shim 86a0af2393694345c3698fe17ef2ccdd509988331f26acb4ca64985fab0eef3f" address="unix:///run/containerd/s/54c9f54200a7a72ca240b2f9dfd9269a55137af1bc3aa8450b60cceafb56c545" namespace=k8s.io protocol=ttrpc version=3
Mar 25 02:00:26.465778 systemd[1]: Started cri-containerd-86a0af2393694345c3698fe17ef2ccdd509988331f26acb4ca64985fab0eef3f.scope - libcontainer container 86a0af2393694345c3698fe17ef2ccdd509988331f26acb4ca64985fab0eef3f.
Mar 25 02:00:26.678212 containerd[1523]: time="2025-03-25T02:00:26.676514334Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8lkwr,Uid:8d47bb44-d19e-4867-86b0-3320e3795239,Namespace:kube-system,Attempt:0,} returns sandbox id \"86a0af2393694345c3698fe17ef2ccdd509988331f26acb4ca64985fab0eef3f\""
Mar 25 02:00:26.687993 containerd[1523]: time="2025-03-25T02:00:26.687921805Z" level=info msg="CreateContainer within sandbox \"86a0af2393694345c3698fe17ef2ccdd509988331f26acb4ca64985fab0eef3f\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 25 02:00:26.711690 containerd[1523]: time="2025-03-25T02:00:26.711632931Z" level=info msg="Container 5c5ffde43347b8ebe27c5ce6daa8c5170a20b11a82a78bc4e83f466aaad9e1ee: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:00:26.722878 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3311575355.mount: Deactivated successfully.
Mar 25 02:00:26.747962 containerd[1523]: time="2025-03-25T02:00:26.744278457Z" level=info msg="CreateContainer within sandbox \"86a0af2393694345c3698fe17ef2ccdd509988331f26acb4ca64985fab0eef3f\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"5c5ffde43347b8ebe27c5ce6daa8c5170a20b11a82a78bc4e83f466aaad9e1ee\""
Mar 25 02:00:26.748157 containerd[1523]: time="2025-03-25T02:00:26.748018998Z" level=info msg="StartContainer for \"5c5ffde43347b8ebe27c5ce6daa8c5170a20b11a82a78bc4e83f466aaad9e1ee\""
Mar 25 02:00:26.753459 containerd[1523]: time="2025-03-25T02:00:26.753416061Z" level=info msg="connecting to shim 5c5ffde43347b8ebe27c5ce6daa8c5170a20b11a82a78bc4e83f466aaad9e1ee" address="unix:///run/containerd/s/54c9f54200a7a72ca240b2f9dfd9269a55137af1bc3aa8450b60cceafb56c545" protocol=ttrpc version=3
Mar 25 02:00:26.800443 systemd[1]: Started cri-containerd-5c5ffde43347b8ebe27c5ce6daa8c5170a20b11a82a78bc4e83f466aaad9e1ee.scope - libcontainer container 5c5ffde43347b8ebe27c5ce6daa8c5170a20b11a82a78bc4e83f466aaad9e1ee.
Mar 25 02:00:27.040526 systemd[1]: Created slice kubepods-besteffort-poda5e77da1_5c2c_4b2e_99f6_f9dca5351458.slice - libcontainer container kubepods-besteffort-poda5e77da1_5c2c_4b2e_99f6_f9dca5351458.slice.
Mar 25 02:00:27.055244 containerd[1523]: time="2025-03-25T02:00:27.055085982Z" level=info msg="StartContainer for \"5c5ffde43347b8ebe27c5ce6daa8c5170a20b11a82a78bc4e83f466aaad9e1ee\" returns successfully"
Mar 25 02:00:27.069046 kubelet[2782]: I0325 02:00:27.066423 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkjpd\" (UniqueName: \"kubernetes.io/projected/a5e77da1-5c2c-4b2e-99f6-f9dca5351458-kube-api-access-dkjpd\") pod \"tigera-operator-ccfc44587-qd769\" (UID: \"a5e77da1-5c2c-4b2e-99f6-f9dca5351458\") " pod="tigera-operator/tigera-operator-ccfc44587-qd769"
Mar 25 02:00:27.069046 kubelet[2782]: I0325 02:00:27.066513 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a5e77da1-5c2c-4b2e-99f6-f9dca5351458-var-lib-calico\") pod \"tigera-operator-ccfc44587-qd769\" (UID: \"a5e77da1-5c2c-4b2e-99f6-f9dca5351458\") " pod="tigera-operator/tigera-operator-ccfc44587-qd769"
Mar 25 02:00:27.357674 containerd[1523]: time="2025-03-25T02:00:27.357199741Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-qd769,Uid:a5e77da1-5c2c-4b2e-99f6-f9dca5351458,Namespace:tigera-operator,Attempt:0,}"
Mar 25 02:00:27.404814 containerd[1523]: time="2025-03-25T02:00:27.403710097Z" level=info msg="connecting to shim eeecede7b27c40dccd4d8db81df5741b319eea44d40081f6e20839b09fcda61c" address="unix:///run/containerd/s/e3fb1f6f4bb45c072ad264fa5f1573dbe098130347bf9766cddcc0f245a85487" namespace=k8s.io protocol=ttrpc version=3
Mar 25 02:00:27.454730 systemd[1]: Started cri-containerd-eeecede7b27c40dccd4d8db81df5741b319eea44d40081f6e20839b09fcda61c.scope - libcontainer container eeecede7b27c40dccd4d8db81df5741b319eea44d40081f6e20839b09fcda61c.
Mar 25 02:00:27.592280 containerd[1523]: time="2025-03-25T02:00:27.591707837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-qd769,Uid:a5e77da1-5c2c-4b2e-99f6-f9dca5351458,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"eeecede7b27c40dccd4d8db81df5741b319eea44d40081f6e20839b09fcda61c\""
Mar 25 02:00:27.600379 containerd[1523]: time="2025-03-25T02:00:27.598942398Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\""
Mar 25 02:00:30.188892 sudo[1810]: pam_unix(sudo:session): session closed for user root
Mar 25 02:00:30.284738 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount716799782.mount: Deactivated successfully.
Mar 25 02:00:30.338992 sshd[1809]: Connection closed by 139.178.68.195 port 58800
Mar 25 02:00:30.340511 sshd-session[1807]: pam_unix(sshd:session): session closed for user core
Mar 25 02:00:30.347511 systemd[1]: sshd@8-10.230.42.214:22-139.178.68.195:58800.service: Deactivated successfully.
Mar 25 02:00:30.356483 systemd-logind[1502]: Session 11 logged out. Waiting for processes to exit.
Mar 25 02:00:30.358326 systemd[1]: session-11.scope: Deactivated successfully.
Mar 25 02:00:30.358848 systemd[1]: session-11.scope: Consumed 8.561s CPU time, 144.9M memory peak.
Mar 25 02:00:30.365788 systemd-logind[1502]: Removed session 11.
Mar 25 02:00:31.180500 containerd[1523]: time="2025-03-25T02:00:31.180432054Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:31.182424 containerd[1523]: time="2025-03-25T02:00:31.182349826Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008"
Mar 25 02:00:31.184100 containerd[1523]: time="2025-03-25T02:00:31.184035757Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:31.186500 containerd[1523]: time="2025-03-25T02:00:31.186442157Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:31.188338 containerd[1523]: time="2025-03-25T02:00:31.187649020Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 3.588603982s"
Mar 25 02:00:31.188338 containerd[1523]: time="2025-03-25T02:00:31.187699713Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\""
Mar 25 02:00:31.191369 containerd[1523]: time="2025-03-25T02:00:31.191325360Z" level=info msg="CreateContainer within sandbox \"eeecede7b27c40dccd4d8db81df5741b319eea44d40081f6e20839b09fcda61c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 25 02:00:31.208195 containerd[1523]: time="2025-03-25T02:00:31.208148992Z" level=info msg="Container 5fa640d85e0d24cfae59e88b8bf3ee2b866ef2787d63776602aba43df5ab74c5: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:00:31.219905 containerd[1523]: time="2025-03-25T02:00:31.219852859Z" level=info msg="CreateContainer within sandbox \"eeecede7b27c40dccd4d8db81df5741b319eea44d40081f6e20839b09fcda61c\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5fa640d85e0d24cfae59e88b8bf3ee2b866ef2787d63776602aba43df5ab74c5\""
Mar 25 02:00:31.221031 containerd[1523]: time="2025-03-25T02:00:31.220634874Z" level=info msg="StartContainer for \"5fa640d85e0d24cfae59e88b8bf3ee2b866ef2787d63776602aba43df5ab74c5\""
Mar 25 02:00:31.222920 containerd[1523]: time="2025-03-25T02:00:31.222790532Z" level=info msg="connecting to shim 5fa640d85e0d24cfae59e88b8bf3ee2b866ef2787d63776602aba43df5ab74c5" address="unix:///run/containerd/s/e3fb1f6f4bb45c072ad264fa5f1573dbe098130347bf9766cddcc0f245a85487" protocol=ttrpc version=3
Mar 25 02:00:31.258827 systemd[1]: Started cri-containerd-5fa640d85e0d24cfae59e88b8bf3ee2b866ef2787d63776602aba43df5ab74c5.scope - libcontainer container 5fa640d85e0d24cfae59e88b8bf3ee2b866ef2787d63776602aba43df5ab74c5.
Mar 25 02:00:31.305845 containerd[1523]: time="2025-03-25T02:00:31.305381550Z" level=info msg="StartContainer for \"5fa640d85e0d24cfae59e88b8bf3ee2b866ef2787d63776602aba43df5ab74c5\" returns successfully"
Mar 25 02:00:32.098941 kubelet[2782]: I0325 02:00:32.098747 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-8lkwr" podStartSLOduration=7.098673157 podStartE2EDuration="7.098673157s" podCreationTimestamp="2025-03-25 02:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:00:28.090019428 +0000 UTC m=+5.454986693" watchObservedRunningTime="2025-03-25 02:00:32.098673157 +0000 UTC m=+9.463640422"
Mar 25 02:00:32.586171 kubelet[2782]: I0325 02:00:32.585814 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-ccfc44587-qd769" podStartSLOduration=2.994257047 podStartE2EDuration="6.5857887s" podCreationTimestamp="2025-03-25 02:00:26 +0000 UTC" firstStartedPulling="2025-03-25 02:00:27.597633653 +0000 UTC m=+4.962600904" lastFinishedPulling="2025-03-25 02:00:31.189165312 +0000 UTC m=+8.554132557" observedRunningTime="2025-03-25 02:00:32.100751076 +0000 UTC m=+9.465718348" watchObservedRunningTime="2025-03-25 02:00:32.5857887 +0000 UTC m=+9.950755951"
Mar 25 02:00:34.714140 systemd[1]: Created slice kubepods-besteffort-podd8604123_c470_4c9c_bc8f_3853a64c4faf.slice - libcontainer container kubepods-besteffort-podd8604123_c470_4c9c_bc8f_3853a64c4faf.slice.
Mar 25 02:00:34.718595 kubelet[2782]: I0325 02:00:34.718068 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qxc5\" (UniqueName: \"kubernetes.io/projected/d8604123-c470-4c9c-bc8f-3853a64c4faf-kube-api-access-8qxc5\") pod \"calico-typha-64cb86b945-b4zbm\" (UID: \"d8604123-c470-4c9c-bc8f-3853a64c4faf\") " pod="calico-system/calico-typha-64cb86b945-b4zbm"
Mar 25 02:00:34.718595 kubelet[2782]: I0325 02:00:34.718165 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8604123-c470-4c9c-bc8f-3853a64c4faf-tigera-ca-bundle\") pod \"calico-typha-64cb86b945-b4zbm\" (UID: \"d8604123-c470-4c9c-bc8f-3853a64c4faf\") " pod="calico-system/calico-typha-64cb86b945-b4zbm"
Mar 25 02:00:34.718595 kubelet[2782]: I0325 02:00:34.718195 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d8604123-c470-4c9c-bc8f-3853a64c4faf-typha-certs\") pod \"calico-typha-64cb86b945-b4zbm\" (UID: \"d8604123-c470-4c9c-bc8f-3853a64c4faf\") " pod="calico-system/calico-typha-64cb86b945-b4zbm"
Mar 25 02:00:34.880864 systemd[1]: Created slice kubepods-besteffort-podaea98d1b_d4bf_4e73_9205_d458c40ca1c2.slice - libcontainer container kubepods-besteffort-podaea98d1b_d4bf_4e73_9205_d458c40ca1c2.slice.
Mar 25 02:00:34.920098 kubelet[2782]: I0325 02:00:34.920021 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-node-certs\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.920098 kubelet[2782]: I0325 02:00:34.920097 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-tigera-ca-bundle\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.920360 kubelet[2782]: I0325 02:00:34.920134 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgvg5\" (UniqueName: \"kubernetes.io/projected/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-kube-api-access-pgvg5\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.920360 kubelet[2782]: I0325 02:00:34.920185 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-var-run-calico\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.920360 kubelet[2782]: I0325 02:00:34.920227 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-lib-modules\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.920360 kubelet[2782]: I0325 02:00:34.920258 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-xtables-lock\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.920360 kubelet[2782]: I0325 02:00:34.920285 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-log-dir\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.921162 kubelet[2782]: I0325 02:00:34.920328 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-var-lib-calico\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.921162 kubelet[2782]: I0325 02:00:34.920371 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-bin-dir\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.921162 kubelet[2782]: I0325 02:00:34.920403 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-net-dir\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.921162 kubelet[2782]: I0325 02:00:34.920431 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-policysync\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.921162 kubelet[2782]: I0325 02:00:34.920469 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-flexvol-driver-host\") pod \"calico-node-rjwwt\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") " pod="calico-system/calico-node-rjwwt"
Mar 25 02:00:34.997700 kubelet[2782]: E0325 02:00:34.994988 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e"
Mar 25 02:00:35.021570 kubelet[2782]: I0325 02:00:35.021489 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b452acd9-0dc2-4c9b-905e-864aa551f46e-registration-dir\") pod \"csi-node-driver-rfmq7\" (UID: \"b452acd9-0dc2-4c9b-905e-864aa551f46e\") " pod="calico-system/csi-node-driver-rfmq7"
Mar 25 02:00:35.022191 kubelet[2782]: I0325 02:00:35.022151 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtm4w\" (UniqueName: \"kubernetes.io/projected/b452acd9-0dc2-4c9b-905e-864aa551f46e-kube-api-access-vtm4w\") pod \"csi-node-driver-rfmq7\" (UID: \"b452acd9-0dc2-4c9b-905e-864aa551f46e\") " pod="calico-system/csi-node-driver-rfmq7"
Mar 25 02:00:35.022820 kubelet[2782]: I0325 02:00:35.022766 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b452acd9-0dc2-4c9b-905e-864aa551f46e-kubelet-dir\") pod \"csi-node-driver-rfmq7\" (UID: \"b452acd9-0dc2-4c9b-905e-864aa551f46e\") " pod="calico-system/csi-node-driver-rfmq7"
Mar 25 02:00:35.023014 kubelet[2782]: I0325 02:00:35.022988 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b452acd9-0dc2-4c9b-905e-864aa551f46e-socket-dir\") pod \"csi-node-driver-rfmq7\" (UID: \"b452acd9-0dc2-4c9b-905e-864aa551f46e\") " pod="calico-system/csi-node-driver-rfmq7"
Mar 25 02:00:35.024701 kubelet[2782]: E0325 02:00:35.024666 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.025126 kubelet[2782]: W0325 02:00:35.025094 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.025425 kubelet[2782]: E0325 02:00:35.025288 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.026644 kubelet[2782]: I0325 02:00:35.026588 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/b452acd9-0dc2-4c9b-905e-864aa551f46e-varrun\") pod \"csi-node-driver-rfmq7\" (UID: \"b452acd9-0dc2-4c9b-905e-864aa551f46e\") " pod="calico-system/csi-node-driver-rfmq7"
Mar 25 02:00:35.027732 kubelet[2782]: E0325 02:00:35.027590 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.027732 kubelet[2782]: W0325 02:00:35.027611 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.027732 kubelet[2782]: E0325 02:00:35.027629 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.029162 kubelet[2782]: E0325 02:00:35.029020 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.029162 kubelet[2782]: W0325 02:00:35.029040 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.031513 containerd[1523]: time="2025-03-25T02:00:35.030346960Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64cb86b945-b4zbm,Uid:d8604123-c470-4c9c-bc8f-3853a64c4faf,Namespace:calico-system,Attempt:0,}"
Mar 25 02:00:35.034262 kubelet[2782]: E0325 02:00:35.029056 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.034262 kubelet[2782]: E0325 02:00:35.031393 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.034262 kubelet[2782]: W0325 02:00:35.031475 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.043136 kubelet[2782]: E0325 02:00:35.035960 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.045805 kubelet[2782]: E0325 02:00:35.045744 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.047038 kubelet[2782]: W0325 02:00:35.046011 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.047038 kubelet[2782]: E0325 02:00:35.046059 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.047738 kubelet[2782]: E0325 02:00:35.047702 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.048172 kubelet[2782]: W0325 02:00:35.047948 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.048448 kubelet[2782]: E0325 02:00:35.048406 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.049659 kubelet[2782]: E0325 02:00:35.048815 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.049860 kubelet[2782]: W0325 02:00:35.048834 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.050478 kubelet[2782]: E0325 02:00:35.050309 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.051735 kubelet[2782]: E0325 02:00:35.051631 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.052209 kubelet[2782]: W0325 02:00:35.051927 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.053773 kubelet[2782]: E0325 02:00:35.053637 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.053773 kubelet[2782]: W0325 02:00:35.053658 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.053773 kubelet[2782]: E0325 02:00:35.053703 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.053773 kubelet[2782]: E0325 02:00:35.053729 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.055131 kubelet[2782]: E0325 02:00:35.054643 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.055131 kubelet[2782]: W0325 02:00:35.054662 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.055742 kubelet[2782]: E0325 02:00:35.055108 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.055742 kubelet[2782]: W0325 02:00:35.055685 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.055985 kubelet[2782]: E0325 02:00:35.055273 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.055985 kubelet[2782]: E0325 02:00:35.055931 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.058790 kubelet[2782]: E0325 02:00:35.058623 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.058790 kubelet[2782]: W0325 02:00:35.058644 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.059328 kubelet[2782]: E0325 02:00:35.059048 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.059328 kubelet[2782]: E0325 02:00:35.059167 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.059328 kubelet[2782]: W0325 02:00:35.059186 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.061783 kubelet[2782]: E0325 02:00:35.061464 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.063581 kubelet[2782]: E0325 02:00:35.062602 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.063581 kubelet[2782]: W0325 02:00:35.062622 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.064218 kubelet[2782]: E0325 02:00:35.063849 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.064776 kubelet[2782]: E0325 02:00:35.064755 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.065916 kubelet[2782]: W0325 02:00:35.065750 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.067344 kubelet[2782]: E0325 02:00:35.067317 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.067650 kubelet[2782]: E0325 02:00:35.067529 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.068192 kubelet[2782]: W0325 02:00:35.068023 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.068341 kubelet[2782]: E0325 02:00:35.068319 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.068794 kubelet[2782]: E0325 02:00:35.068751 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.068794 kubelet[2782]: W0325 02:00:35.068771 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.071994 kubelet[2782]: E0325 02:00:35.068964 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.087531 kubelet[2782]: E0325 02:00:35.087007 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.089196 kubelet[2782]: W0325 02:00:35.089129 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.089419 kubelet[2782]: E0325 02:00:35.089395 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.093137 kubelet[2782]: E0325 02:00:35.092837 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.094500 kubelet[2782]: W0325 02:00:35.094389 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.094500 kubelet[2782]: E0325 02:00:35.094460 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.098128 kubelet[2782]: E0325 02:00:35.097967 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.098128 kubelet[2782]: W0325 02:00:35.097996 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.098128 kubelet[2782]: E0325 02:00:35.098015 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.108775 kubelet[2782]: E0325 02:00:35.108747 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.108961 kubelet[2782]: W0325 02:00:35.108923 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.109118 kubelet[2782]: E0325 02:00:35.109091 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.109586 kubelet[2782]: E0325 02:00:35.109566 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.109822 kubelet[2782]: W0325 02:00:35.109674 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.109822 kubelet[2782]: E0325 02:00:35.109705 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.112298 kubelet[2782]: E0325 02:00:35.112060 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.112298 kubelet[2782]: W0325 02:00:35.112080 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.112298 kubelet[2782]: E0325 02:00:35.112103 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.113891 kubelet[2782]: E0325 02:00:35.113871 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.116125 kubelet[2782]: W0325 02:00:35.115922 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.116125 kubelet[2782]: E0325 02:00:35.115972 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.117965 kubelet[2782]: E0325 02:00:35.117729 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.118278 kubelet[2782]: W0325 02:00:35.118087 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.118278 kubelet[2782]: E0325 02:00:35.118117 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.119910 kubelet[2782]: E0325 02:00:35.119727 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.119910 kubelet[2782]: W0325 02:00:35.119747 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.119910 kubelet[2782]: E0325 02:00:35.119764 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.121831 kubelet[2782]: E0325 02:00:35.121810 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.122128 kubelet[2782]: W0325 02:00:35.121969 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.122128 kubelet[2782]: E0325 02:00:35.121998 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.124799 kubelet[2782]: E0325 02:00:35.124605 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.124799 kubelet[2782]: W0325 02:00:35.124627 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.124799 kubelet[2782]: E0325 02:00:35.124644 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.127601 kubelet[2782]: E0325 02:00:35.127313 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.127601 kubelet[2782]: W0325 02:00:35.127337 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.127601 kubelet[2782]: E0325 02:00:35.127361 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.128489 kubelet[2782]: E0325 02:00:35.128340 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.128489 kubelet[2782]: W0325 02:00:35.128361 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.128489 kubelet[2782]: E0325 02:00:35.128378 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 02:00:35.129508 kubelet[2782]: E0325 02:00:35.129406 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 02:00:35.129508 kubelet[2782]: W0325 02:00:35.129425 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 02:00:35.129508 kubelet[2782]: E0325 02:00:35.129442 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 25 02:00:35.130336 kubelet[2782]: E0325 02:00:35.130083 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.130336 kubelet[2782]: W0325 02:00:35.130102 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.130336 kubelet[2782]: E0325 02:00:35.130117 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.131005 kubelet[2782]: E0325 02:00:35.130827 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.131005 kubelet[2782]: W0325 02:00:35.130851 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.131005 kubelet[2782]: E0325 02:00:35.130867 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.131560 kubelet[2782]: E0325 02:00:35.131367 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.131560 kubelet[2782]: W0325 02:00:35.131385 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.131560 kubelet[2782]: E0325 02:00:35.131400 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.132209 kubelet[2782]: E0325 02:00:35.132118 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.132209 kubelet[2782]: W0325 02:00:35.132137 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.132209 kubelet[2782]: E0325 02:00:35.132153 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.147455 kubelet[2782]: E0325 02:00:35.147382 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.147981 kubelet[2782]: W0325 02:00:35.147414 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.147981 kubelet[2782]: E0325 02:00:35.147646 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.149253 kubelet[2782]: E0325 02:00:35.148836 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.149253 kubelet[2782]: W0325 02:00:35.149005 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.149253 kubelet[2782]: E0325 02:00:35.149023 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.150129 kubelet[2782]: E0325 02:00:35.149808 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.150129 kubelet[2782]: W0325 02:00:35.150015 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.151162 kubelet[2782]: E0325 02:00:35.150060 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.151452 kubelet[2782]: E0325 02:00:35.150890 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.151452 kubelet[2782]: W0325 02:00:35.151306 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.151452 kubelet[2782]: E0325 02:00:35.151325 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.152987 kubelet[2782]: E0325 02:00:35.152759 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.152987 kubelet[2782]: W0325 02:00:35.152778 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.153394 kubelet[2782]: E0325 02:00:35.152795 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.156566 kubelet[2782]: E0325 02:00:35.156359 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.156566 kubelet[2782]: W0325 02:00:35.156379 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.156566 kubelet[2782]: E0325 02:00:35.156436 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.157737 kubelet[2782]: E0325 02:00:35.157441 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.157737 kubelet[2782]: W0325 02:00:35.157531 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.157737 kubelet[2782]: E0325 02:00:35.157610 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.158810 containerd[1523]: time="2025-03-25T02:00:35.158406681Z" level=info msg="connecting to shim 5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493" address="unix:///run/containerd/s/79d1b11f101795e85fcc64126d3b9a07925399dc006da665bfc739d7e22925a1" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:00:35.159118 kubelet[2782]: E0325 02:00:35.158558 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.159118 kubelet[2782]: W0325 02:00:35.158575 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.159118 kubelet[2782]: E0325 02:00:35.158625 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.161371 kubelet[2782]: E0325 02:00:35.160819 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.161371 kubelet[2782]: W0325 02:00:35.160839 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.161371 kubelet[2782]: E0325 02:00:35.160861 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.164014 kubelet[2782]: E0325 02:00:35.163382 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.164014 kubelet[2782]: W0325 02:00:35.163438 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.166587 kubelet[2782]: E0325 02:00:35.166218 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.166587 kubelet[2782]: W0325 02:00:35.166239 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.167039 kubelet[2782]: E0325 02:00:35.166975 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.167920 kubelet[2782]: W0325 02:00:35.167663 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.171755 kubelet[2782]: E0325 02:00:35.171622 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.172853 kubelet[2782]: E0325 02:00:35.172012 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.172853 kubelet[2782]: E0325 02:00:35.172041 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.172853 kubelet[2782]: E0325 02:00:35.172091 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.172853 kubelet[2782]: W0325 02:00:35.172105 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.172853 kubelet[2782]: E0325 02:00:35.172722 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.172853 kubelet[2782]: W0325 02:00:35.172737 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.173927 kubelet[2782]: E0325 02:00:35.173907 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.174783 kubelet[2782]: W0325 02:00:35.174096 2782 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.174959 kubelet[2782]: E0325 02:00:35.174910 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.175177 kubelet[2782]: W0325 02:00:35.175155 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.175406 kubelet[2782]: E0325 02:00:35.175381 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.182149 kubelet[2782]: E0325 02:00:35.181931 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.182149 kubelet[2782]: E0325 02:00:35.182057 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.182149 kubelet[2782]: E0325 02:00:35.182082 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.183964 kubelet[2782]: E0325 02:00:35.183918 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.185568 kubelet[2782]: W0325 02:00:35.184632 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.185568 kubelet[2782]: E0325 02:00:35.184669 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.187837 kubelet[2782]: E0325 02:00:35.187403 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.187837 kubelet[2782]: W0325 02:00:35.187425 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.187837 kubelet[2782]: E0325 02:00:35.187491 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.188517 kubelet[2782]: E0325 02:00:35.188473 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.188762 kubelet[2782]: W0325 02:00:35.188693 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.189308 kubelet[2782]: E0325 02:00:35.188996 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.189669 kubelet[2782]: E0325 02:00:35.189648 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.189838 kubelet[2782]: W0325 02:00:35.189817 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.191782 kubelet[2782]: E0325 02:00:35.191658 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.192076 kubelet[2782]: E0325 02:00:35.192043 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.192360 kubelet[2782]: W0325 02:00:35.192309 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.194485 kubelet[2782]: E0325 02:00:35.193654 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.194485 kubelet[2782]: W0325 02:00:35.193674 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.194485 kubelet[2782]: E0325 02:00:35.194089 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.194485 kubelet[2782]: E0325 02:00:35.194118 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.194721 containerd[1523]: time="2025-03-25T02:00:35.194519509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-rjwwt,Uid:aea98d1b-d4bf-4e73-9205-d458c40ca1c2,Namespace:calico-system,Attempt:0,}" Mar 25 02:00:35.195422 kubelet[2782]: E0325 02:00:35.195315 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.195897 kubelet[2782]: W0325 02:00:35.195872 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.196486 kubelet[2782]: E0325 02:00:35.196463 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.198057 kubelet[2782]: E0325 02:00:35.197557 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.198057 kubelet[2782]: W0325 02:00:35.197803 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.198057 kubelet[2782]: E0325 02:00:35.198001 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.199149 kubelet[2782]: E0325 02:00:35.198836 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.199149 kubelet[2782]: W0325 02:00:35.198854 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.199149 kubelet[2782]: E0325 02:00:35.198871 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.200565 kubelet[2782]: E0325 02:00:35.200375 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.200565 kubelet[2782]: W0325 02:00:35.200421 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.200565 kubelet[2782]: E0325 02:00:35.200441 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 02:00:35.222615 kubelet[2782]: E0325 02:00:35.222433 2782 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 02:00:35.222615 kubelet[2782]: W0325 02:00:35.222474 2782 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 02:00:35.222615 kubelet[2782]: E0325 02:00:35.222529 2782 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 02:00:35.252176 systemd[1]: Started cri-containerd-5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493.scope - libcontainer container 5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493. Mar 25 02:00:35.272855 containerd[1523]: time="2025-03-25T02:00:35.272800055Z" level=info msg="connecting to shim f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd" address="unix:///run/containerd/s/eb9306af8a4d5af306328d3a263c4ef8b2ba566115fb73a562edee2ed0576f83" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:00:35.327889 systemd[1]: Started cri-containerd-f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd.scope - libcontainer container f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd. 
Mar 25 02:00:35.394101 containerd[1523]: time="2025-03-25T02:00:35.394046780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-rjwwt,Uid:aea98d1b-d4bf-4e73-9205-d458c40ca1c2,Namespace:calico-system,Attempt:0,} returns sandbox id \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\"" Mar 25 02:00:35.399570 containerd[1523]: time="2025-03-25T02:00:35.399382043Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 25 02:00:35.476029 containerd[1523]: time="2025-03-25T02:00:35.475200143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-64cb86b945-b4zbm,Uid:d8604123-c470-4c9c-bc8f-3853a64c4faf,Namespace:calico-system,Attempt:0,} returns sandbox id \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\"" Mar 25 02:00:36.973082 kubelet[2782]: E0325 02:00:36.973007 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e" Mar 25 02:00:37.474202 containerd[1523]: time="2025-03-25T02:00:37.474144516Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:00:37.476507 containerd[1523]: time="2025-03-25T02:00:37.476385084Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 25 02:00:37.477683 containerd[1523]: time="2025-03-25T02:00:37.477605151Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:00:37.481063 containerd[1523]: time="2025-03-25T02:00:37.480983345Z" level=info msg="ImageCreate 
event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:00:37.482571 containerd[1523]: time="2025-03-25T02:00:37.482165124Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 2.082710441s" Mar 25 02:00:37.482571 containerd[1523]: time="2025-03-25T02:00:37.482216117Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 25 02:00:37.485575 containerd[1523]: time="2025-03-25T02:00:37.485407742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 25 02:00:37.490507 containerd[1523]: time="2025-03-25T02:00:37.490331966Z" level=info msg="CreateContainer within sandbox \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 02:00:37.508806 containerd[1523]: time="2025-03-25T02:00:37.508747271Z" level=info msg="Container dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:00:37.524023 containerd[1523]: time="2025-03-25T02:00:37.523837296Z" level=info msg="CreateContainer within sandbox \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1\"" Mar 25 02:00:37.527635 containerd[1523]: time="2025-03-25T02:00:37.525292459Z" level=info 
msg="StartContainer for \"dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1\"" Mar 25 02:00:37.527635 containerd[1523]: time="2025-03-25T02:00:37.527286800Z" level=info msg="connecting to shim dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1" address="unix:///run/containerd/s/eb9306af8a4d5af306328d3a263c4ef8b2ba566115fb73a562edee2ed0576f83" protocol=ttrpc version=3 Mar 25 02:00:37.583960 systemd[1]: Started cri-containerd-dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1.scope - libcontainer container dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1. Mar 25 02:00:37.676749 containerd[1523]: time="2025-03-25T02:00:37.676699092Z" level=info msg="StartContainer for \"dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1\" returns successfully" Mar 25 02:00:37.713387 systemd[1]: cri-containerd-dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1.scope: Deactivated successfully. Mar 25 02:00:37.734054 containerd[1523]: time="2025-03-25T02:00:37.733000582Z" level=info msg="received exit event container_id:\"dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1\" id:\"dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1\" pid:3340 exited_at:{seconds:1742868037 nanos:717696674}" Mar 25 02:00:37.734054 containerd[1523]: time="2025-03-25T02:00:37.733426425Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1\" id:\"dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1\" pid:3340 exited_at:{seconds:1742868037 nanos:717696674}" Mar 25 02:00:37.777329 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1-rootfs.mount: Deactivated successfully. 
Mar 25 02:00:38.973704 kubelet[2782]: E0325 02:00:38.973468 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e"
Mar 25 02:00:40.939600 containerd[1523]: time="2025-03-25T02:00:40.938376648Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:40.949220 containerd[1523]: time="2025-03-25T02:00:40.949024651Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075"
Mar 25 02:00:40.955975 containerd[1523]: time="2025-03-25T02:00:40.955835172Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:40.959143 containerd[1523]: time="2025-03-25T02:00:40.959084201Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:40.960393 containerd[1523]: time="2025-03-25T02:00:40.960310669Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 3.474623566s"
Mar 25 02:00:40.960393 containerd[1523]: time="2025-03-25T02:00:40.960386369Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\""
Mar 25 02:00:40.968936 containerd[1523]: time="2025-03-25T02:00:40.968762963Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\""
Mar 25 02:00:40.979192 kubelet[2782]: E0325 02:00:40.978518 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e"
Mar 25 02:00:41.033592 containerd[1523]: time="2025-03-25T02:00:41.033441791Z" level=info msg="CreateContainer within sandbox \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 25 02:00:41.048111 containerd[1523]: time="2025-03-25T02:00:41.047966751Z" level=info msg="Container eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:00:41.056801 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4108317748.mount: Deactivated successfully.
Mar 25 02:00:41.070350 containerd[1523]: time="2025-03-25T02:00:41.070276711Z" level=info msg="CreateContainer within sandbox \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\""
Mar 25 02:00:41.074730 containerd[1523]: time="2025-03-25T02:00:41.074673222Z" level=info msg="StartContainer for \"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\""
Mar 25 02:00:41.079953 containerd[1523]: time="2025-03-25T02:00:41.077211654Z" level=info msg="connecting to shim eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82" address="unix:///run/containerd/s/79d1b11f101795e85fcc64126d3b9a07925399dc006da665bfc739d7e22925a1" protocol=ttrpc version=3
Mar 25 02:00:41.127161 systemd[1]: Started cri-containerd-eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82.scope - libcontainer container eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82.
Mar 25 02:00:41.277422 containerd[1523]: time="2025-03-25T02:00:41.277217379Z" level=info msg="StartContainer for \"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\" returns successfully"
Mar 25 02:00:42.974601 kubelet[2782]: E0325 02:00:42.973474 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e"
Mar 25 02:00:43.166574 kubelet[2782]: I0325 02:00:43.166172 2782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 02:00:44.976479 kubelet[2782]: E0325 02:00:44.973254 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e"
Mar 25 02:00:46.984804 kubelet[2782]: E0325 02:00:46.983941 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e"
Mar 25 02:00:48.977207 kubelet[2782]: E0325 02:00:48.974452 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e"
Mar 25 02:00:49.606604 kubelet[2782]: I0325 02:00:49.605936 2782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 02:00:49.661638 kubelet[2782]: I0325 02:00:49.661396 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-64cb86b945-b4zbm" podStartSLOduration=10.172261885 podStartE2EDuration="15.66126524s" podCreationTimestamp="2025-03-25 02:00:34 +0000 UTC" firstStartedPulling="2025-03-25 02:00:35.479139166 +0000 UTC m=+12.844106417" lastFinishedPulling="2025-03-25 02:00:40.968142508 +0000 UTC m=+18.333109772" observedRunningTime="2025-03-25 02:00:42.20868431 +0000 UTC m=+19.573651562" watchObservedRunningTime="2025-03-25 02:00:49.66126524 +0000 UTC m=+27.026232491"
Mar 25 02:00:50.974812 kubelet[2782]: E0325 02:00:50.973604 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e"
Mar 25 02:00:52.991047 kubelet[2782]: E0325 02:00:52.990698 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e"
Mar 25 02:00:53.238245 containerd[1523]: time="2025-03-25T02:00:53.236617587Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:53.240970 containerd[1523]: time="2025-03-25T02:00:53.239071243Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477"
Mar 25 02:00:53.240970 containerd[1523]: time="2025-03-25T02:00:53.239126584Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:53.256358 containerd[1523]: time="2025-03-25T02:00:53.256187992Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 02:00:53.258864 containerd[1523]: time="2025-03-25T02:00:53.258831902Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 12.289981329s"
Mar 25 02:00:53.259014 containerd[1523]: time="2025-03-25T02:00:53.258987420Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\""
Mar 25 02:00:53.268439 containerd[1523]: time="2025-03-25T02:00:53.268372083Z" level=info msg="CreateContainer within sandbox \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 25 02:00:53.291088 containerd[1523]: time="2025-03-25T02:00:53.291026856Z" level=info msg="Container a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:00:53.303270 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount420665629.mount: Deactivated successfully.
Mar 25 02:00:53.319120 containerd[1523]: time="2025-03-25T02:00:53.319065545Z" level=info msg="CreateContainer within sandbox \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e\""
Mar 25 02:00:53.321973 containerd[1523]: time="2025-03-25T02:00:53.321904504Z" level=info msg="StartContainer for \"a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e\""
Mar 25 02:00:53.325746 containerd[1523]: time="2025-03-25T02:00:53.325277338Z" level=info msg="connecting to shim a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e" address="unix:///run/containerd/s/eb9306af8a4d5af306328d3a263c4ef8b2ba566115fb73a562edee2ed0576f83" protocol=ttrpc version=3
Mar 25 02:00:53.411864 systemd[1]: Started cri-containerd-a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e.scope - libcontainer container a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e.
Mar 25 02:00:53.499214 containerd[1523]: time="2025-03-25T02:00:53.498930672Z" level=info msg="StartContainer for \"a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e\" returns successfully"
Mar 25 02:00:54.597027 systemd[1]: cri-containerd-a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e.scope: Deactivated successfully.
Mar 25 02:00:54.598801 systemd[1]: cri-containerd-a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e.scope: Consumed 866ms CPU time, 154.7M memory peak, 3.2M read from disk, 154M written to disk.
Mar 25 02:00:54.607556 containerd[1523]: time="2025-03-25T02:00:54.607310367Z" level=info msg="received exit event container_id:\"a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e\" id:\"a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e\" pid:3437 exited_at:{seconds:1742868054 nanos:604835804}"
Mar 25 02:00:54.608702 containerd[1523]: time="2025-03-25T02:00:54.608347058Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e\" id:\"a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e\" pid:3437 exited_at:{seconds:1742868054 nanos:604835804}"
Mar 25 02:00:54.661308 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e-rootfs.mount: Deactivated successfully.
Mar 25 02:00:54.795148 kubelet[2782]: I0325 02:00:54.795084 2782 kubelet_node_status.go:502] "Fast updating node status as it just became ready"
Mar 25 02:00:54.915575 systemd[1]: Created slice kubepods-burstable-podd73724c5_eafb_4ad9_a2d7_760fb2b955a3.slice - libcontainer container kubepods-burstable-podd73724c5_eafb_4ad9_a2d7_760fb2b955a3.slice.
Mar 25 02:00:54.931765 systemd[1]: Created slice kubepods-burstable-podd129c3b4_f6a0_4232_9067_911267ef131b.slice - libcontainer container kubepods-burstable-podd129c3b4_f6a0_4232_9067_911267ef131b.slice.
Mar 25 02:00:54.946386 systemd[1]: Created slice kubepods-besteffort-podff374ecd_e367_41b4_8d7a_74ac7ff2882d.slice - libcontainer container kubepods-besteffort-podff374ecd_e367_41b4_8d7a_74ac7ff2882d.slice.
Mar 25 02:00:54.956733 kubelet[2782]: I0325 02:00:54.956015 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ff374ecd-e367-41b4-8d7a-74ac7ff2882d-calico-apiserver-certs\") pod \"calico-apiserver-7c6dff454b-6s6kn\" (UID: \"ff374ecd-e367-41b4-8d7a-74ac7ff2882d\") " pod="calico-apiserver/calico-apiserver-7c6dff454b-6s6kn"
Mar 25 02:00:54.956733 kubelet[2782]: I0325 02:00:54.956087 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcsq\" (UniqueName: \"kubernetes.io/projected/760bc0b6-71e1-4be3-8d8e-ceb103060e98-kube-api-access-qmcsq\") pod \"calico-kube-controllers-5b9d7665cb-7crdx\" (UID: \"760bc0b6-71e1-4be3-8d8e-ceb103060e98\") " pod="calico-system/calico-kube-controllers-5b9d7665cb-7crdx"
Mar 25 02:00:54.956733 kubelet[2782]: I0325 02:00:54.956139 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d129c3b4-f6a0-4232-9067-911267ef131b-config-volume\") pod \"coredns-668d6bf9bc-f58bh\" (UID: \"d129c3b4-f6a0-4232-9067-911267ef131b\") " pod="kube-system/coredns-668d6bf9bc-f58bh"
Mar 25 02:00:54.956733 kubelet[2782]: I0325 02:00:54.956168 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9hck\" (UniqueName: \"kubernetes.io/projected/d129c3b4-f6a0-4232-9067-911267ef131b-kube-api-access-p9hck\") pod \"coredns-668d6bf9bc-f58bh\" (UID: \"d129c3b4-f6a0-4232-9067-911267ef131b\") " pod="kube-system/coredns-668d6bf9bc-f58bh"
Mar 25 02:00:54.956733 kubelet[2782]: I0325 02:00:54.956206 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g25g\" (UniqueName: \"kubernetes.io/projected/7d76ff3d-f174-4f4c-9d88-240590ad06da-kube-api-access-7g25g\") pod \"calico-apiserver-7c6dff454b-6m8j9\" (UID: \"7d76ff3d-f174-4f4c-9d88-240590ad06da\") " pod="calico-apiserver/calico-apiserver-7c6dff454b-6m8j9"
Mar 25 02:00:54.957144 kubelet[2782]: I0325 02:00:54.956271 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7d76ff3d-f174-4f4c-9d88-240590ad06da-calico-apiserver-certs\") pod \"calico-apiserver-7c6dff454b-6m8j9\" (UID: \"7d76ff3d-f174-4f4c-9d88-240590ad06da\") " pod="calico-apiserver/calico-apiserver-7c6dff454b-6m8j9"
Mar 25 02:00:54.957144 kubelet[2782]: I0325 02:00:54.956311 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkgxj\" (UniqueName: \"kubernetes.io/projected/ff374ecd-e367-41b4-8d7a-74ac7ff2882d-kube-api-access-dkgxj\") pod \"calico-apiserver-7c6dff454b-6s6kn\" (UID: \"ff374ecd-e367-41b4-8d7a-74ac7ff2882d\") " pod="calico-apiserver/calico-apiserver-7c6dff454b-6s6kn"
Mar 25 02:00:54.957144 kubelet[2782]: I0325 02:00:54.956350 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plkzd\" (UniqueName: \"kubernetes.io/projected/d73724c5-eafb-4ad9-a2d7-760fb2b955a3-kube-api-access-plkzd\") pod \"coredns-668d6bf9bc-2zw92\" (UID: \"d73724c5-eafb-4ad9-a2d7-760fb2b955a3\") " pod="kube-system/coredns-668d6bf9bc-2zw92"
Mar 25 02:00:54.957144 kubelet[2782]: I0325 02:00:54.956399 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/760bc0b6-71e1-4be3-8d8e-ceb103060e98-tigera-ca-bundle\") pod \"calico-kube-controllers-5b9d7665cb-7crdx\" (UID: \"760bc0b6-71e1-4be3-8d8e-ceb103060e98\") " pod="calico-system/calico-kube-controllers-5b9d7665cb-7crdx"
Mar 25 02:00:54.957144 kubelet[2782]: I0325 02:00:54.956447 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d73724c5-eafb-4ad9-a2d7-760fb2b955a3-config-volume\") pod \"coredns-668d6bf9bc-2zw92\" (UID: \"d73724c5-eafb-4ad9-a2d7-760fb2b955a3\") " pod="kube-system/coredns-668d6bf9bc-2zw92"
Mar 25 02:00:54.957188 systemd[1]: Created slice kubepods-besteffort-pod760bc0b6_71e1_4be3_8d8e_ceb103060e98.slice - libcontainer container kubepods-besteffort-pod760bc0b6_71e1_4be3_8d8e_ceb103060e98.slice.
Mar 25 02:00:54.970404 systemd[1]: Created slice kubepods-besteffort-pod7d76ff3d_f174_4f4c_9d88_240590ad06da.slice - libcontainer container kubepods-besteffort-pod7d76ff3d_f174_4f4c_9d88_240590ad06da.slice.
Mar 25 02:00:54.985054 systemd[1]: Created slice kubepods-besteffort-podb452acd9_0dc2_4c9b_905e_864aa551f46e.slice - libcontainer container kubepods-besteffort-podb452acd9_0dc2_4c9b_905e_864aa551f46e.slice.
Mar 25 02:00:55.002353 containerd[1523]: time="2025-03-25T02:00:55.000145351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rfmq7,Uid:b452acd9-0dc2-4c9b-905e-864aa551f46e,Namespace:calico-system,Attempt:0,}"
Mar 25 02:00:55.234276 containerd[1523]: time="2025-03-25T02:00:55.233634400Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2zw92,Uid:d73724c5-eafb-4ad9-a2d7-760fb2b955a3,Namespace:kube-system,Attempt:0,}"
Mar 25 02:00:55.242018 containerd[1523]: time="2025-03-25T02:00:55.241740410Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-f58bh,Uid:d129c3b4-f6a0-4232-9067-911267ef131b,Namespace:kube-system,Attempt:0,}"
Mar 25 02:00:55.254204 containerd[1523]: time="2025-03-25T02:00:55.254150740Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c6dff454b-6s6kn,Uid:ff374ecd-e367-41b4-8d7a-74ac7ff2882d,Namespace:calico-apiserver,Attempt:0,}"
Mar 25 02:00:55.266708 containerd[1523]: time="2025-03-25T02:00:55.266652826Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b9d7665cb-7crdx,Uid:760bc0b6-71e1-4be3-8d8e-ceb103060e98,Namespace:calico-system,Attempt:0,}"
Mar 25 02:00:55.276302 containerd[1523]: time="2025-03-25T02:00:55.276242769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c6dff454b-6m8j9,Uid:7d76ff3d-f174-4f4c-9d88-240590ad06da,Namespace:calico-apiserver,Attempt:0,}"
Mar 25 02:00:55.363678 containerd[1523]: time="2025-03-25T02:00:55.363518698Z" level=error msg="Failed to destroy network for sandbox \"fc75d0f4a549187e7116764a39178c4c26c55ce68c2ca56e49fc87104f0e6332\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.439971 containerd[1523]: time="2025-03-25T02:00:55.383679733Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rfmq7,Uid:b452acd9-0dc2-4c9b-905e-864aa551f46e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc75d0f4a549187e7116764a39178c4c26c55ce68c2ca56e49fc87104f0e6332\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.479825 kubelet[2782]: E0325 02:00:55.471489 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc75d0f4a549187e7116764a39178c4c26c55ce68c2ca56e49fc87104f0e6332\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.495784 kubelet[2782]: E0325 02:00:55.492197 2782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc75d0f4a549187e7116764a39178c4c26c55ce68c2ca56e49fc87104f0e6332\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rfmq7"
Mar 25 02:00:55.495784 kubelet[2782]: E0325 02:00:55.492290 2782 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc75d0f4a549187e7116764a39178c4c26c55ce68c2ca56e49fc87104f0e6332\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rfmq7"
Mar 25 02:00:55.495784 kubelet[2782]: E0325 02:00:55.492468 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-rfmq7_calico-system(b452acd9-0dc2-4c9b-905e-864aa551f46e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-rfmq7_calico-system(b452acd9-0dc2-4c9b-905e-864aa551f46e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fc75d0f4a549187e7116764a39178c4c26c55ce68c2ca56e49fc87104f0e6332\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e"
Mar 25 02:00:55.515038 containerd[1523]: time="2025-03-25T02:00:55.514957112Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\""
Mar 25 02:00:55.605788 containerd[1523]: time="2025-03-25T02:00:55.605651530Z" level=error msg="Failed to destroy network for sandbox \"9f8a7972901d0a3de4c60ced269ae0108343e9327c0c5d26dc7071ba218fe48d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.608917 containerd[1523]: time="2025-03-25T02:00:55.607624494Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2zw92,Uid:d73724c5-eafb-4ad9-a2d7-760fb2b955a3,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f8a7972901d0a3de4c60ced269ae0108343e9327c0c5d26dc7071ba218fe48d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.609959 kubelet[2782]: E0325 02:00:55.608944 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f8a7972901d0a3de4c60ced269ae0108343e9327c0c5d26dc7071ba218fe48d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.609959 kubelet[2782]: E0325 02:00:55.609047 2782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f8a7972901d0a3de4c60ced269ae0108343e9327c0c5d26dc7071ba218fe48d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-2zw92"
Mar 25 02:00:55.609959 kubelet[2782]: E0325 02:00:55.609095 2782 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f8a7972901d0a3de4c60ced269ae0108343e9327c0c5d26dc7071ba218fe48d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-2zw92"
Mar 25 02:00:55.610143 kubelet[2782]: E0325 02:00:55.609174 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-2zw92_kube-system(d73724c5-eafb-4ad9-a2d7-760fb2b955a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-2zw92_kube-system(d73724c5-eafb-4ad9-a2d7-760fb2b955a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9f8a7972901d0a3de4c60ced269ae0108343e9327c0c5d26dc7071ba218fe48d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-2zw92" podUID="d73724c5-eafb-4ad9-a2d7-760fb2b955a3"
Mar 25 02:00:55.621829 containerd[1523]: time="2025-03-25T02:00:55.621079328Z" level=error msg="Failed to destroy network for sandbox \"219b8bcc07c23e51eb3debe34c710991fb418a6c00f3ae0e121d306fb7480a21\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.621829 containerd[1523]: time="2025-03-25T02:00:55.621681725Z" level=error msg="Failed to destroy network for sandbox \"fb85209a074c990d08ad3dad27a5730c8b709d986aaf5ebb9844046ee44aaf99\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.623920 containerd[1523]: time="2025-03-25T02:00:55.623813817Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c6dff454b-6s6kn,Uid:ff374ecd-e367-41b4-8d7a-74ac7ff2882d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"219b8bcc07c23e51eb3debe34c710991fb418a6c00f3ae0e121d306fb7480a21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.624343 kubelet[2782]: E0325 02:00:55.624183 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"219b8bcc07c23e51eb3debe34c710991fb418a6c00f3ae0e121d306fb7480a21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.624343 kubelet[2782]: E0325 02:00:55.624287 2782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"219b8bcc07c23e51eb3debe34c710991fb418a6c00f3ae0e121d306fb7480a21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c6dff454b-6s6kn"
Mar 25 02:00:55.624976 kubelet[2782]: E0325 02:00:55.624322 2782 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"219b8bcc07c23e51eb3debe34c710991fb418a6c00f3ae0e121d306fb7480a21\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c6dff454b-6s6kn"
Mar 25 02:00:55.624976 kubelet[2782]: E0325 02:00:55.624517 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7c6dff454b-6s6kn_calico-apiserver(ff374ecd-e367-41b4-8d7a-74ac7ff2882d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7c6dff454b-6s6kn_calico-apiserver(ff374ecd-e367-41b4-8d7a-74ac7ff2882d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"219b8bcc07c23e51eb3debe34c710991fb418a6c00f3ae0e121d306fb7480a21\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7c6dff454b-6s6kn" podUID="ff374ecd-e367-41b4-8d7a-74ac7ff2882d"
Mar 25 02:00:55.625732 containerd[1523]: time="2025-03-25T02:00:55.625650361Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-f58bh,Uid:d129c3b4-f6a0-4232-9067-911267ef131b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb85209a074c990d08ad3dad27a5730c8b709d986aaf5ebb9844046ee44aaf99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.626525 kubelet[2782]: E0325 02:00:55.625892 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb85209a074c990d08ad3dad27a5730c8b709d986aaf5ebb9844046ee44aaf99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.626525 kubelet[2782]: E0325 02:00:55.625952 2782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb85209a074c990d08ad3dad27a5730c8b709d986aaf5ebb9844046ee44aaf99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-f58bh"
Mar 25 02:00:55.626525 kubelet[2782]: E0325 02:00:55.625977 2782 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb85209a074c990d08ad3dad27a5730c8b709d986aaf5ebb9844046ee44aaf99\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-f58bh"
Mar 25 02:00:55.626879 kubelet[2782]: E0325 02:00:55.626016 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-f58bh_kube-system(d129c3b4-f6a0-4232-9067-911267ef131b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-f58bh_kube-system(d129c3b4-f6a0-4232-9067-911267ef131b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fb85209a074c990d08ad3dad27a5730c8b709d986aaf5ebb9844046ee44aaf99\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-f58bh" podUID="d129c3b4-f6a0-4232-9067-911267ef131b"
Mar 25 02:00:55.645748 containerd[1523]: time="2025-03-25T02:00:55.645679662Z" level=error msg="Failed to destroy network for sandbox \"db94c4bba4254784e74e983d706246742f02d9bbb464e6645415aef5ba08d2d2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.647804 containerd[1523]: time="2025-03-25T02:00:55.647655996Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b9d7665cb-7crdx,Uid:760bc0b6-71e1-4be3-8d8e-ceb103060e98,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"db94c4bba4254784e74e983d706246742f02d9bbb464e6645415aef5ba08d2d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.648307 kubelet[2782]: E0325 02:00:55.648124 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db94c4bba4254784e74e983d706246742f02d9bbb464e6645415aef5ba08d2d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.648307 kubelet[2782]: E0325 02:00:55.648294 2782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db94c4bba4254784e74e983d706246742f02d9bbb464e6645415aef5ba08d2d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5b9d7665cb-7crdx"
Mar 25 02:00:55.648515 kubelet[2782]: E0325 02:00:55.648327 2782 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db94c4bba4254784e74e983d706246742f02d9bbb464e6645415aef5ba08d2d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5b9d7665cb-7crdx"
Mar 25 02:00:55.648515 kubelet[2782]: E0325 02:00:55.648398 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5b9d7665cb-7crdx_calico-system(760bc0b6-71e1-4be3-8d8e-ceb103060e98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5b9d7665cb-7crdx_calico-system(760bc0b6-71e1-4be3-8d8e-ceb103060e98)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"db94c4bba4254784e74e983d706246742f02d9bbb464e6645415aef5ba08d2d2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b9d7665cb-7crdx" podUID="760bc0b6-71e1-4be3-8d8e-ceb103060e98"
Mar 25 02:00:55.649736 containerd[1523]: time="2025-03-25T02:00:55.649671302Z" level=error msg="Failed to destroy network for sandbox \"895961b65977ebd3eaa3b1cb1f49b4d818aa3762e38b2f53708f3a5329472a88\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.651027 containerd[1523]: time="2025-03-25T02:00:55.650879929Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c6dff454b-6m8j9,Uid:7d76ff3d-f174-4f4c-9d88-240590ad06da,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"895961b65977ebd3eaa3b1cb1f49b4d818aa3762e38b2f53708f3a5329472a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.651518 kubelet[2782]: E0325 02:00:55.651310 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"895961b65977ebd3eaa3b1cb1f49b4d818aa3762e38b2f53708f3a5329472a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 02:00:55.651518 kubelet[2782]: E0325 02:00:55.651355 2782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"895961b65977ebd3eaa3b1cb1f49b4d818aa3762e38b2f53708f3a5329472a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c6dff454b-6m8j9"
Mar 25 02:00:55.651518 kubelet[2782]: E0325 02:00:55.651380 2782 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"895961b65977ebd3eaa3b1cb1f49b4d818aa3762e38b2f53708f3a5329472a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c6dff454b-6m8j9"
Mar 25 02:00:55.651906 kubelet[2782]: E0325 02:00:55.651450 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7c6dff454b-6m8j9_calico-apiserver(7d76ff3d-f174-4f4c-9d88-240590ad06da)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7c6dff454b-6m8j9_calico-apiserver(7d76ff3d-f174-4f4c-9d88-240590ad06da)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"895961b65977ebd3eaa3b1cb1f49b4d818aa3762e38b2f53708f3a5329472a88\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\""
pod="calico-apiserver/calico-apiserver-7c6dff454b-6m8j9" podUID="7d76ff3d-f174-4f4c-9d88-240590ad06da" Mar 25 02:00:55.664703 systemd[1]: run-netns-cni\x2d25004f4f\x2d699f\x2da240\x2d2fdc\x2d06a8869fd93d.mount: Deactivated successfully. Mar 25 02:01:06.028649 containerd[1523]: time="2025-03-25T02:01:06.028405541Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c6dff454b-6m8j9,Uid:7d76ff3d-f174-4f4c-9d88-240590ad06da,Namespace:calico-apiserver,Attempt:0,}" Mar 25 02:01:06.438385 containerd[1523]: time="2025-03-25T02:01:06.438305968Z" level=error msg="Failed to destroy network for sandbox \"d3013abc4ddd0658a45a4df0ab82d4efd4b51d3b3e5a47c52946ba1d85175c8f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:06.446706 systemd[1]: run-netns-cni\x2d9fbfe53e\x2d24b0\x2d8bcd\x2d6134\x2db0f9931de899.mount: Deactivated successfully. 
Mar 25 02:01:06.451828 containerd[1523]: time="2025-03-25T02:01:06.451018568Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c6dff454b-6m8j9,Uid:7d76ff3d-f174-4f4c-9d88-240590ad06da,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d3013abc4ddd0658a45a4df0ab82d4efd4b51d3b3e5a47c52946ba1d85175c8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:06.453113 kubelet[2782]: E0325 02:01:06.451712 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d3013abc4ddd0658a45a4df0ab82d4efd4b51d3b3e5a47c52946ba1d85175c8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:06.453113 kubelet[2782]: E0325 02:01:06.452890 2782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d3013abc4ddd0658a45a4df0ab82d4efd4b51d3b3e5a47c52946ba1d85175c8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7c6dff454b-6m8j9" Mar 25 02:01:06.453113 kubelet[2782]: E0325 02:01:06.453033 2782 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d3013abc4ddd0658a45a4df0ab82d4efd4b51d3b3e5a47c52946ba1d85175c8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-7c6dff454b-6m8j9" Mar 25 02:01:06.455685 kubelet[2782]: E0325 02:01:06.454169 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7c6dff454b-6m8j9_calico-apiserver(7d76ff3d-f174-4f4c-9d88-240590ad06da)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7c6dff454b-6m8j9_calico-apiserver(7d76ff3d-f174-4f4c-9d88-240590ad06da)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d3013abc4ddd0658a45a4df0ab82d4efd4b51d3b3e5a47c52946ba1d85175c8f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7c6dff454b-6m8j9" podUID="7d76ff3d-f174-4f4c-9d88-240590ad06da" Mar 25 02:01:06.983228 containerd[1523]: time="2025-03-25T02:01:06.981869746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-f58bh,Uid:d129c3b4-f6a0-4232-9067-911267ef131b,Namespace:kube-system,Attempt:0,}" Mar 25 02:01:07.169328 containerd[1523]: time="2025-03-25T02:01:07.168937780Z" level=error msg="Failed to destroy network for sandbox \"341f1af50b1e46adb7b0a560fc905c5be308365fd321bc1ac8656b71f7170f1d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:07.171237 containerd[1523]: time="2025-03-25T02:01:07.171095802Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-f58bh,Uid:d129c3b4-f6a0-4232-9067-911267ef131b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"341f1af50b1e46adb7b0a560fc905c5be308365fd321bc1ac8656b71f7170f1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:07.174646 kubelet[2782]: E0325 02:01:07.174390 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"341f1af50b1e46adb7b0a560fc905c5be308365fd321bc1ac8656b71f7170f1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:07.174646 kubelet[2782]: E0325 02:01:07.174590 2782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"341f1af50b1e46adb7b0a560fc905c5be308365fd321bc1ac8656b71f7170f1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-f58bh" Mar 25 02:01:07.174516 systemd[1]: run-netns-cni\x2d94fea809\x2df6e5\x2d6fe2\x2d9be0\x2d4dd5f3a374ba.mount: Deactivated successfully. 
Mar 25 02:01:07.175743 kubelet[2782]: E0325 02:01:07.174992 2782 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"341f1af50b1e46adb7b0a560fc905c5be308365fd321bc1ac8656b71f7170f1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-f58bh" Mar 25 02:01:07.177733 kubelet[2782]: E0325 02:01:07.175131 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-f58bh_kube-system(d129c3b4-f6a0-4232-9067-911267ef131b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-f58bh_kube-system(d129c3b4-f6a0-4232-9067-911267ef131b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"341f1af50b1e46adb7b0a560fc905c5be308365fd321bc1ac8656b71f7170f1d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-f58bh" podUID="d129c3b4-f6a0-4232-9067-911267ef131b" Mar 25 02:01:07.502407 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount42117431.mount: Deactivated successfully. 
Mar 25 02:01:07.628515 containerd[1523]: time="2025-03-25T02:01:07.628216908Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:07.657188 containerd[1523]: time="2025-03-25T02:01:07.631648444Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 25 02:01:07.657476 containerd[1523]: time="2025-03-25T02:01:07.640427411Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:07.657999 containerd[1523]: time="2025-03-25T02:01:07.642765328Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 12.126373924s" Mar 25 02:01:07.658776 containerd[1523]: time="2025-03-25T02:01:07.658744802Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:07.662952 containerd[1523]: time="2025-03-25T02:01:07.662884593Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 25 02:01:07.698445 containerd[1523]: time="2025-03-25T02:01:07.698362797Z" level=info msg="CreateContainer within sandbox \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 02:01:07.722135 containerd[1523]: time="2025-03-25T02:01:07.722079636Z" level=info msg="Container 
b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:01:07.808758 containerd[1523]: time="2025-03-25T02:01:07.807922721Z" level=info msg="CreateContainer within sandbox \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\"" Mar 25 02:01:07.815415 containerd[1523]: time="2025-03-25T02:01:07.815110432Z" level=info msg="StartContainer for \"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\"" Mar 25 02:01:07.826957 containerd[1523]: time="2025-03-25T02:01:07.826522154Z" level=info msg="connecting to shim b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27" address="unix:///run/containerd/s/eb9306af8a4d5af306328d3a263c4ef8b2ba566115fb73a562edee2ed0576f83" protocol=ttrpc version=3 Mar 25 02:01:07.985312 containerd[1523]: time="2025-03-25T02:01:07.985244139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rfmq7,Uid:b452acd9-0dc2-4c9b-905e-864aa551f46e,Namespace:calico-system,Attempt:0,}" Mar 25 02:01:07.985597 containerd[1523]: time="2025-03-25T02:01:07.985558299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b9d7665cb-7crdx,Uid:760bc0b6-71e1-4be3-8d8e-ceb103060e98,Namespace:calico-system,Attempt:0,}" Mar 25 02:01:08.027821 systemd[1]: Started cri-containerd-b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27.scope - libcontainer container b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27. Mar 25 02:01:08.183936 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2886777713.mount: Deactivated successfully. 
Mar 25 02:01:08.358144 containerd[1523]: time="2025-03-25T02:01:08.358010478Z" level=info msg="StartContainer for \"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" returns successfully" Mar 25 02:01:08.375335 containerd[1523]: time="2025-03-25T02:01:08.375141987Z" level=error msg="Failed to destroy network for sandbox \"161b17ca573418db1dae56a770ece405d40cf74503633d7edb1522be90853f42\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:08.383739 systemd[1]: run-netns-cni\x2d6d4e4fed\x2d25e1\x2d52f6\x2de98a\x2db57de32dc8be.mount: Deactivated successfully. Mar 25 02:01:08.403237 containerd[1523]: time="2025-03-25T02:01:08.402852959Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b9d7665cb-7crdx,Uid:760bc0b6-71e1-4be3-8d8e-ceb103060e98,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"161b17ca573418db1dae56a770ece405d40cf74503633d7edb1522be90853f42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:08.404146 kubelet[2782]: E0325 02:01:08.404044 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"161b17ca573418db1dae56a770ece405d40cf74503633d7edb1522be90853f42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:08.406872 kubelet[2782]: E0325 02:01:08.404197 2782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"161b17ca573418db1dae56a770ece405d40cf74503633d7edb1522be90853f42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5b9d7665cb-7crdx" Mar 25 02:01:08.406872 kubelet[2782]: E0325 02:01:08.404246 2782 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"161b17ca573418db1dae56a770ece405d40cf74503633d7edb1522be90853f42\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5b9d7665cb-7crdx" Mar 25 02:01:08.406872 kubelet[2782]: E0325 02:01:08.404341 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5b9d7665cb-7crdx_calico-system(760bc0b6-71e1-4be3-8d8e-ceb103060e98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5b9d7665cb-7crdx_calico-system(760bc0b6-71e1-4be3-8d8e-ceb103060e98)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"161b17ca573418db1dae56a770ece405d40cf74503633d7edb1522be90853f42\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5b9d7665cb-7crdx" podUID="760bc0b6-71e1-4be3-8d8e-ceb103060e98" Mar 25 02:01:08.434115 containerd[1523]: time="2025-03-25T02:01:08.431296032Z" level=error msg="Failed to destroy network for sandbox \"69cc220662f56171de0380457add07a0732a9216701c313c8ed74608dec91131\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:08.439336 systemd[1]: run-netns-cni\x2d0c15ab9d\x2df1bc\x2d9bed\x2db5ea\x2d5c9654660a3e.mount: Deactivated successfully. Mar 25 02:01:08.441334 containerd[1523]: time="2025-03-25T02:01:08.441263746Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rfmq7,Uid:b452acd9-0dc2-4c9b-905e-864aa551f46e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"69cc220662f56171de0380457add07a0732a9216701c313c8ed74608dec91131\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:08.441889 kubelet[2782]: E0325 02:01:08.441826 2782 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69cc220662f56171de0380457add07a0732a9216701c313c8ed74608dec91131\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 02:01:08.442047 kubelet[2782]: E0325 02:01:08.441933 2782 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69cc220662f56171de0380457add07a0732a9216701c313c8ed74608dec91131\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rfmq7" Mar 25 02:01:08.442047 kubelet[2782]: E0325 02:01:08.441966 2782 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69cc220662f56171de0380457add07a0732a9216701c313c8ed74608dec91131\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rfmq7" Mar 25 02:01:08.442167 kubelet[2782]: E0325 02:01:08.442028 2782 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-rfmq7_calico-system(b452acd9-0dc2-4c9b-905e-864aa551f46e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-rfmq7_calico-system(b452acd9-0dc2-4c9b-905e-864aa551f46e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"69cc220662f56171de0380457add07a0732a9216701c313c8ed74608dec91131\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-rfmq7" podUID="b452acd9-0dc2-4c9b-905e-864aa551f46e" Mar 25 02:01:08.540119 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 25 02:01:08.542659 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
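Every sandbox failure above reduces to the same mechanism: the Calico CNI plugin stats /var/lib/calico/nodename, a file that the calico/node container writes on startup via a hostPath mount, and fails the add/delete until it appears. A minimal self-contained sketch of that readiness check, using a temporary stand-in directory instead of the real /var/lib/calico (the directory, hostname, and function name here are illustrative assumptions, not taken from the log):

```shell
# Stand-in for /var/lib/calico so the sketch is runnable anywhere.
CALICO_DIR="$(mktemp -d)"

# Approximation of the plugin's check: the CNI add fails with
# "stat .../nodename: no such file or directory" until calico/node
# has written its node name into this file.
check_nodename() {
  if [ -f "$CALICO_DIR/nodename" ]; then
    echo "ready: $(cat "$CALICO_DIR/nodename")"
  else
    echo "not ready"
  fi
}

check_nodename   # before calico/node starts: "not ready"

# calico/node writes the node name on startup; in this log the
# node is srv-u0apo.gb1.brightbox.com.
echo "srv-u0apo.gb1.brightbox.com" > "$CALICO_DIR/nodename"

check_nodename   # after startup: "ready: srv-u0apo.gb1.brightbox.com"
```

This is why the errors stop on their own once the calico-node container (started above as b681e296c2c7…) is running: no CNI configuration change is involved, only the appearance of the nodename file.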
Mar 25 02:01:08.647566 kubelet[2782]: I0325 02:01:08.619016 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-rjwwt" podStartSLOduration=2.336873762 podStartE2EDuration="34.60262228s" podCreationTimestamp="2025-03-25 02:00:34 +0000 UTC" firstStartedPulling="2025-03-25 02:00:35.398668947 +0000 UTC m=+12.763636199" lastFinishedPulling="2025-03-25 02:01:07.664417465 +0000 UTC m=+45.029384717" observedRunningTime="2025-03-25 02:01:08.596712541 +0000 UTC m=+45.961679802" watchObservedRunningTime="2025-03-25 02:01:08.60262228 +0000 UTC m=+45.967589540" Mar 25 02:01:09.144992 containerd[1523]: time="2025-03-25T02:01:09.144823047Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" id:\"1f051dcf45f66bb0baeabc31171f0df0a493696bf33d4200d4c62ce78a2f301a\" pid:3851 exit_status:1 exited_at:{seconds:1742868069 nanos:142234764}" Mar 25 02:01:09.692087 containerd[1523]: time="2025-03-25T02:01:09.692013839Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" id:\"ed4ba4a4d6d5c5d9e21be3e0897650685c95fc67ba1ed79d89b02bb308417846\" pid:3887 exit_status:1 exited_at:{seconds:1742868069 nanos:690870368}" Mar 25 02:01:09.974584 containerd[1523]: time="2025-03-25T02:01:09.974119654Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c6dff454b-6s6kn,Uid:ff374ecd-e367-41b4-8d7a-74ac7ff2882d,Namespace:calico-apiserver,Attempt:0,}" Mar 25 02:01:09.975069 containerd[1523]: time="2025-03-25T02:01:09.974125006Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2zw92,Uid:d73724c5-eafb-4ad9-a2d7-760fb2b955a3,Namespace:kube-system,Attempt:0,}" Mar 25 02:01:10.529165 systemd-networkd[1448]: cali4408b8ec14e: Link UP Mar 25 02:01:10.535249 systemd-networkd[1448]: cali4408b8ec14e: Gained carrier Mar 25 02:01:10.538349 
containerd[1523]: 2025-03-25 02:01:10.109 [INFO][3898] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 02:01:10.538349 containerd[1523]: 2025-03-25 02:01:10.171 [INFO][3898] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0 calico-apiserver-7c6dff454b- calico-apiserver ff374ecd-e367-41b4-8d7a-74ac7ff2882d 727 0 2025-03-25 02:00:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7c6dff454b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-u0apo.gb1.brightbox.com calico-apiserver-7c6dff454b-6s6kn eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali4408b8ec14e [] []}} ContainerID="73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6s6kn" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-" Mar 25 02:01:10.538349 containerd[1523]: 2025-03-25 02:01:10.172 [INFO][3898] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6s6kn" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0" Mar 25 02:01:10.538349 containerd[1523]: 2025-03-25 02:01:10.414 [INFO][3924] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" HandleID="k8s-pod-network.73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0" Mar 25 02:01:10.543904 containerd[1523]: 
2025-03-25 02:01:10.443 [INFO][3924] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" HandleID="k8s-pod-network.73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000264a80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-u0apo.gb1.brightbox.com", "pod":"calico-apiserver-7c6dff454b-6s6kn", "timestamp":"2025-03-25 02:01:10.414743392 +0000 UTC"}, Hostname:"srv-u0apo.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:01:10.543904 containerd[1523]: 2025-03-25 02:01:10.443 [INFO][3924] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 02:01:10.543904 containerd[1523]: 2025-03-25 02:01:10.443 [INFO][3924] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 02:01:10.543904 containerd[1523]: 2025-03-25 02:01:10.443 [INFO][3924] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-u0apo.gb1.brightbox.com' Mar 25 02:01:10.543904 containerd[1523]: 2025-03-25 02:01:10.446 [INFO][3924] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.543904 containerd[1523]: 2025-03-25 02:01:10.457 [INFO][3924] ipam/ipam.go 372: Looking up existing affinities for host host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.543904 containerd[1523]: 2025-03-25 02:01:10.465 [INFO][3924] ipam/ipam.go 489: Trying affinity for 192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.543904 containerd[1523]: 2025-03-25 02:01:10.468 [INFO][3924] ipam/ipam.go 155: Attempting to load block cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.543904 containerd[1523]: 2025-03-25 02:01:10.471 [INFO][3924] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.573972 containerd[1523]: 2025-03-25 02:01:10.471 [INFO][3924] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.573972 containerd[1523]: 2025-03-25 02:01:10.473 [INFO][3924] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08 Mar 25 02:01:10.573972 containerd[1523]: 2025-03-25 02:01:10.483 [INFO][3924] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.573972 containerd[1523]: 2025-03-25 02:01:10.491 [INFO][3924] 
ipam/ipam.go 1216: Successfully claimed IPs: [192.168.116.65/26] block=192.168.116.64/26 handle="k8s-pod-network.73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.573972 containerd[1523]: 2025-03-25 02:01:10.491 [INFO][3924] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.116.65/26] handle="k8s-pod-network.73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.573972 containerd[1523]: 2025-03-25 02:01:10.492 [INFO][3924] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 02:01:10.573972 containerd[1523]: 2025-03-25 02:01:10.492 [INFO][3924] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.65/26] IPv6=[] ContainerID="73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" HandleID="k8s-pod-network.73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0" Mar 25 02:01:10.574329 containerd[1523]: 2025-03-25 02:01:10.496 [INFO][3898] cni-plugin/k8s.go 386: Populated endpoint ContainerID="73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6s6kn" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0", GenerateName:"calico-apiserver-7c6dff454b-", Namespace:"calico-apiserver", SelfLink:"", UID:"ff374ecd-e367-41b4-8d7a-74ac7ff2882d", ResourceVersion:"727", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c6dff454b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-7c6dff454b-6s6kn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.116.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4408b8ec14e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:01:10.574444 containerd[1523]: 2025-03-25 02:01:10.497 [INFO][3898] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.116.65/32] ContainerID="73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6s6kn" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0" Mar 25 02:01:10.574444 containerd[1523]: 2025-03-25 02:01:10.497 [INFO][3898] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4408b8ec14e ContainerID="73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6s6kn" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0" Mar 25 02:01:10.574444 containerd[1523]: 2025-03-25 02:01:10.511 [INFO][3898] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6s6kn" 
WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0" Mar 25 02:01:10.574688 containerd[1523]: 2025-03-25 02:01:10.512 [INFO][3898] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6s6kn" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0", GenerateName:"calico-apiserver-7c6dff454b-", Namespace:"calico-apiserver", SelfLink:"", UID:"ff374ecd-e367-41b4-8d7a-74ac7ff2882d", ResourceVersion:"727", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c6dff454b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08", Pod:"calico-apiserver-7c6dff454b-6s6kn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.116.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4408b8ec14e", MAC:"06:43:37:3b:f9:32", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:01:10.574806 containerd[1523]: 2025-03-25 02:01:10.527 [INFO][3898] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6s6kn" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6s6kn-eth0" Mar 25 02:01:10.731416 systemd-networkd[1448]: cali4ad7828512e: Link UP Mar 25 02:01:10.731827 systemd-networkd[1448]: cali4ad7828512e: Gained carrier Mar 25 02:01:10.766458 containerd[1523]: 2025-03-25 02:01:10.113 [INFO][3904] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 02:01:10.766458 containerd[1523]: 2025-03-25 02:01:10.171 [INFO][3904] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0 coredns-668d6bf9bc- kube-system d73724c5-eafb-4ad9-a2d7-760fb2b955a3 720 0 2025-03-25 02:00:26 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-u0apo.gb1.brightbox.com coredns-668d6bf9bc-2zw92 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4ad7828512e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-2zw92" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-" Mar 25 02:01:10.766458 containerd[1523]: 2025-03-25 02:01:10.172 [INFO][3904] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-2zw92" 
WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0" Mar 25 02:01:10.766458 containerd[1523]: 2025-03-25 02:01:10.414 [INFO][3925] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" HandleID="k8s-pod-network.1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" Workload="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0" Mar 25 02:01:10.767266 containerd[1523]: 2025-03-25 02:01:10.444 [INFO][3925] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" HandleID="k8s-pod-network.1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" Workload="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003f08c0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-u0apo.gb1.brightbox.com", "pod":"coredns-668d6bf9bc-2zw92", "timestamp":"2025-03-25 02:01:10.414745532 +0000 UTC"}, Hostname:"srv-u0apo.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:01:10.767266 containerd[1523]: 2025-03-25 02:01:10.444 [INFO][3925] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 02:01:10.767266 containerd[1523]: 2025-03-25 02:01:10.493 [INFO][3925] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 02:01:10.767266 containerd[1523]: 2025-03-25 02:01:10.494 [INFO][3925] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-u0apo.gb1.brightbox.com' Mar 25 02:01:10.767266 containerd[1523]: 2025-03-25 02:01:10.549 [INFO][3925] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.767266 containerd[1523]: 2025-03-25 02:01:10.569 [INFO][3925] ipam/ipam.go 372: Looking up existing affinities for host host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.767266 containerd[1523]: 2025-03-25 02:01:10.592 [INFO][3925] ipam/ipam.go 489: Trying affinity for 192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.767266 containerd[1523]: 2025-03-25 02:01:10.607 [INFO][3925] ipam/ipam.go 155: Attempting to load block cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.767266 containerd[1523]: 2025-03-25 02:01:10.620 [INFO][3925] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.769703 containerd[1523]: 2025-03-25 02:01:10.622 [INFO][3925] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.769703 containerd[1523]: 2025-03-25 02:01:10.632 [INFO][3925] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c Mar 25 02:01:10.769703 containerd[1523]: 2025-03-25 02:01:10.664 [INFO][3925] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.769703 containerd[1523]: 2025-03-25 02:01:10.689 [INFO][3925] 
ipam/ipam.go 1216: Successfully claimed IPs: [192.168.116.66/26] block=192.168.116.64/26 handle="k8s-pod-network.1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.769703 containerd[1523]: 2025-03-25 02:01:10.689 [INFO][3925] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.116.66/26] handle="k8s-pod-network.1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:10.769703 containerd[1523]: 2025-03-25 02:01:10.690 [INFO][3925] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 02:01:10.769703 containerd[1523]: 2025-03-25 02:01:10.690 [INFO][3925] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.66/26] IPv6=[] ContainerID="1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" HandleID="k8s-pod-network.1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" Workload="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0" Mar 25 02:01:10.770033 containerd[1523]: 2025-03-25 02:01:10.719 [INFO][3904] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-2zw92" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"d73724c5-eafb-4ad9-a2d7-760fb2b955a3", ResourceVersion:"720", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", 
"projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"", Pod:"coredns-668d6bf9bc-2zw92", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4ad7828512e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:01:10.770033 containerd[1523]: 2025-03-25 02:01:10.719 [INFO][3904] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.116.66/32] ContainerID="1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-2zw92" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0" Mar 25 02:01:10.770033 containerd[1523]: 2025-03-25 02:01:10.719 [INFO][3904] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4ad7828512e ContainerID="1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-2zw92" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0" Mar 25 02:01:10.770033 containerd[1523]: 2025-03-25 02:01:10.728 [INFO][3904] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-2zw92" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0" Mar 25 02:01:10.770033 containerd[1523]: 2025-03-25 02:01:10.728 [INFO][3904] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-2zw92" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"d73724c5-eafb-4ad9-a2d7-760fb2b955a3", ResourceVersion:"720", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c", Pod:"coredns-668d6bf9bc-2zw92", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4ad7828512e", 
MAC:"c6:c9:64:40:2c:51", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:01:10.770033 containerd[1523]: 2025-03-25 02:01:10.760 [INFO][3904] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" Namespace="kube-system" Pod="coredns-668d6bf9bc-2zw92" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--2zw92-eth0" Mar 25 02:01:10.833416 containerd[1523]: time="2025-03-25T02:01:10.832771577Z" level=info msg="connecting to shim 73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08" address="unix:///run/containerd/s/4df55b3c71f74389210f5475f2746269bd43168dbf2403d3683bcd67e186993f" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:01:10.836907 containerd[1523]: time="2025-03-25T02:01:10.836855648Z" level=info msg="connecting to shim 1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c" address="unix:///run/containerd/s/7393db99679ff2335411fa22d616f2c801bd8866293854b7292e06a2ef1cbdb3" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:01:10.961710 systemd[1]: Started cri-containerd-73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08.scope - libcontainer container 73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08. Mar 25 02:01:11.019346 systemd[1]: Started cri-containerd-1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c.scope - libcontainer container 1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c. 
Mar 25 02:01:11.154276 containerd[1523]: time="2025-03-25T02:01:11.154200078Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2zw92,Uid:d73724c5-eafb-4ad9-a2d7-760fb2b955a3,Namespace:kube-system,Attempt:0,} returns sandbox id \"1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c\"" Mar 25 02:01:11.170062 containerd[1523]: time="2025-03-25T02:01:11.169604208Z" level=info msg="CreateContainer within sandbox \"1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 02:01:11.191723 containerd[1523]: time="2025-03-25T02:01:11.191662289Z" level=info msg="Container 3b8053ae30fb0167d7004362ea81a138a48adede3b5a36c6b0884f7cbfa3dc2d: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:01:11.201369 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1072962428.mount: Deactivated successfully. Mar 25 02:01:11.212563 containerd[1523]: time="2025-03-25T02:01:11.211359943Z" level=info msg="CreateContainer within sandbox \"1e1091a3267d5eac422cb5e76c0a367205d0d7c677fafcc342d87d9d9ea9ac5c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3b8053ae30fb0167d7004362ea81a138a48adede3b5a36c6b0884f7cbfa3dc2d\"" Mar 25 02:01:11.216571 containerd[1523]: time="2025-03-25T02:01:11.215678321Z" level=info msg="StartContainer for \"3b8053ae30fb0167d7004362ea81a138a48adede3b5a36c6b0884f7cbfa3dc2d\"" Mar 25 02:01:11.225675 containerd[1523]: time="2025-03-25T02:01:11.225492923Z" level=info msg="connecting to shim 3b8053ae30fb0167d7004362ea81a138a48adede3b5a36c6b0884f7cbfa3dc2d" address="unix:///run/containerd/s/7393db99679ff2335411fa22d616f2c801bd8866293854b7292e06a2ef1cbdb3" protocol=ttrpc version=3 Mar 25 02:01:11.278575 systemd[1]: Started cri-containerd-3b8053ae30fb0167d7004362ea81a138a48adede3b5a36c6b0884f7cbfa3dc2d.scope - libcontainer container 3b8053ae30fb0167d7004362ea81a138a48adede3b5a36c6b0884f7cbfa3dc2d. 
Mar 25 02:01:11.369609 containerd[1523]: time="2025-03-25T02:01:11.369114188Z" level=info msg="StartContainer for \"3b8053ae30fb0167d7004362ea81a138a48adede3b5a36c6b0884f7cbfa3dc2d\" returns successfully" Mar 25 02:01:11.428696 containerd[1523]: time="2025-03-25T02:01:11.428003224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c6dff454b-6s6kn,Uid:ff374ecd-e367-41b4-8d7a-74ac7ff2882d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08\"" Mar 25 02:01:11.435698 containerd[1523]: time="2025-03-25T02:01:11.435656195Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 02:01:11.486496 containerd[1523]: time="2025-03-25T02:01:11.486319472Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" id:\"330c171e8453c76087e7b8a12a65a43ac0b8c0746053ee91bd77d246557689ae\" pid:4045 exit_status:1 exited_at:{seconds:1742868071 nanos:484242896}" Mar 25 02:01:11.713604 kernel: bpftool[4231]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 25 02:01:12.133471 systemd-networkd[1448]: vxlan.calico: Link UP Mar 25 02:01:12.133484 systemd-networkd[1448]: vxlan.calico: Gained carrier Mar 25 02:01:12.362936 systemd-networkd[1448]: cali4408b8ec14e: Gained IPv6LL Mar 25 02:01:12.633167 kubelet[2782]: I0325 02:01:12.632102 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-2zw92" podStartSLOduration=46.632033356 podStartE2EDuration="46.632033356s" podCreationTimestamp="2025-03-25 02:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:01:11.637664157 +0000 UTC m=+49.002631427" watchObservedRunningTime="2025-03-25 02:01:12.632033356 +0000 UTC m=+49.997000615" Mar 25 02:01:12.747282 systemd-networkd[1448]: 
cali4ad7828512e: Gained IPv6LL Mar 25 02:01:14.029102 systemd-networkd[1448]: vxlan.calico: Gained IPv6LL Mar 25 02:01:16.558780 containerd[1523]: time="2025-03-25T02:01:16.558483298Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:16.560615 containerd[1523]: time="2025-03-25T02:01:16.560515210Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204" Mar 25 02:01:16.561569 containerd[1523]: time="2025-03-25T02:01:16.561364347Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:16.565933 containerd[1523]: time="2025-03-25T02:01:16.565896523Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:16.568085 containerd[1523]: time="2025-03-25T02:01:16.567255247Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 5.131525987s" Mar 25 02:01:16.568085 containerd[1523]: time="2025-03-25T02:01:16.567351949Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 25 02:01:16.575037 containerd[1523]: time="2025-03-25T02:01:16.574942510Z" level=info msg="CreateContainer within sandbox \"73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08\" for container 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 02:01:16.591905 containerd[1523]: time="2025-03-25T02:01:16.591859697Z" level=info msg="Container a54def74b64921db04ebed4bc490f081c4ddcd36cfc40ed241092d9e3c73c26f: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:01:16.601574 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1660301075.mount: Deactivated successfully. Mar 25 02:01:16.609126 containerd[1523]: time="2025-03-25T02:01:16.609055407Z" level=info msg="CreateContainer within sandbox \"73694eaa209df849f6c15903633be847f6a16d0fb57f6bd1e72a7482d1e9fc08\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a54def74b64921db04ebed4bc490f081c4ddcd36cfc40ed241092d9e3c73c26f\"" Mar 25 02:01:16.612595 containerd[1523]: time="2025-03-25T02:01:16.611648207Z" level=info msg="StartContainer for \"a54def74b64921db04ebed4bc490f081c4ddcd36cfc40ed241092d9e3c73c26f\"" Mar 25 02:01:16.613646 containerd[1523]: time="2025-03-25T02:01:16.613608025Z" level=info msg="connecting to shim a54def74b64921db04ebed4bc490f081c4ddcd36cfc40ed241092d9e3c73c26f" address="unix:///run/containerd/s/4df55b3c71f74389210f5475f2746269bd43168dbf2403d3683bcd67e186993f" protocol=ttrpc version=3 Mar 25 02:01:16.675462 systemd[1]: Started cri-containerd-a54def74b64921db04ebed4bc490f081c4ddcd36cfc40ed241092d9e3c73c26f.scope - libcontainer container a54def74b64921db04ebed4bc490f081c4ddcd36cfc40ed241092d9e3c73c26f. 
Mar 25 02:01:16.777726 containerd[1523]: time="2025-03-25T02:01:16.777671589Z" level=info msg="StartContainer for \"a54def74b64921db04ebed4bc490f081c4ddcd36cfc40ed241092d9e3c73c26f\" returns successfully" Mar 25 02:01:16.976346 containerd[1523]: time="2025-03-25T02:01:16.976269784Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c6dff454b-6m8j9,Uid:7d76ff3d-f174-4f4c-9d88-240590ad06da,Namespace:calico-apiserver,Attempt:0,}" Mar 25 02:01:17.219358 systemd-networkd[1448]: calic4edf12a31f: Link UP Mar 25 02:01:17.219832 systemd-networkd[1448]: calic4edf12a31f: Gained carrier Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.069 [INFO][4361] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0 calico-apiserver-7c6dff454b- calico-apiserver 7d76ff3d-f174-4f4c-9d88-240590ad06da 725 0 2025-03-25 02:00:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7c6dff454b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-u0apo.gb1.brightbox.com calico-apiserver-7c6dff454b-6m8j9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic4edf12a31f [] []}} ContainerID="5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6m8j9" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.070 [INFO][4361] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6m8j9" 
WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.139 [INFO][4372] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" HandleID="k8s-pod-network.5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.156 [INFO][4372] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" HandleID="k8s-pod-network.5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ed460), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-u0apo.gb1.brightbox.com", "pod":"calico-apiserver-7c6dff454b-6m8j9", "timestamp":"2025-03-25 02:01:17.139407571 +0000 UTC"}, Hostname:"srv-u0apo.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.156 [INFO][4372] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.156 [INFO][4372] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.156 [INFO][4372] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-u0apo.gb1.brightbox.com' Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.160 [INFO][4372] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.169 [INFO][4372] ipam/ipam.go 372: Looking up existing affinities for host host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.176 [INFO][4372] ipam/ipam.go 489: Trying affinity for 192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.179 [INFO][4372] ipam/ipam.go 155: Attempting to load block cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.183 [INFO][4372] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.183 [INFO][4372] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.185 [INFO][4372] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552 Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.195 [INFO][4372] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.207 [INFO][4372] 
ipam/ipam.go 1216: Successfully claimed IPs: [192.168.116.67/26] block=192.168.116.64/26 handle="k8s-pod-network.5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.208 [INFO][4372] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.116.67/26] handle="k8s-pod-network.5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.208 [INFO][4372] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 02:01:17.259979 containerd[1523]: 2025-03-25 02:01:17.208 [INFO][4372] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.67/26] IPv6=[] ContainerID="5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" HandleID="k8s-pod-network.5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0" Mar 25 02:01:17.264497 containerd[1523]: 2025-03-25 02:01:17.210 [INFO][4361] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6m8j9" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0", GenerateName:"calico-apiserver-7c6dff454b-", Namespace:"calico-apiserver", SelfLink:"", UID:"7d76ff3d-f174-4f4c-9d88-240590ad06da", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c6dff454b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-7c6dff454b-6m8j9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.116.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic4edf12a31f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:01:17.264497 containerd[1523]: 2025-03-25 02:01:17.210 [INFO][4361] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.116.67/32] ContainerID="5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6m8j9" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0" Mar 25 02:01:17.264497 containerd[1523]: 2025-03-25 02:01:17.210 [INFO][4361] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic4edf12a31f ContainerID="5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6m8j9" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0" Mar 25 02:01:17.264497 containerd[1523]: 2025-03-25 02:01:17.220 [INFO][4361] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6m8j9" 
WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0" Mar 25 02:01:17.264497 containerd[1523]: 2025-03-25 02:01:17.221 [INFO][4361] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6m8j9" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0", GenerateName:"calico-apiserver-7c6dff454b-", Namespace:"calico-apiserver", SelfLink:"", UID:"7d76ff3d-f174-4f4c-9d88-240590ad06da", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7c6dff454b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552", Pod:"calico-apiserver-7c6dff454b-6m8j9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.116.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic4edf12a31f", MAC:"fe:a5:c2:03:37:53", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:01:17.264497 containerd[1523]: 2025-03-25 02:01:17.247 [INFO][4361] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" Namespace="calico-apiserver" Pod="calico-apiserver-7c6dff454b-6m8j9" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--apiserver--7c6dff454b--6m8j9-eth0" Mar 25 02:01:17.321454 containerd[1523]: time="2025-03-25T02:01:17.320987560Z" level=info msg="connecting to shim 5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552" address="unix:///run/containerd/s/638326288d13561badb2176de0424201420b8a94b82132d8fc32c0e23b067dd1" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:01:17.369903 systemd[1]: Started cri-containerd-5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552.scope - libcontainer container 5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552. 
Mar 25 02:01:17.471581 containerd[1523]: time="2025-03-25T02:01:17.468829085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7c6dff454b-6m8j9,Uid:7d76ff3d-f174-4f4c-9d88-240590ad06da,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552\"" Mar 25 02:01:17.479212 containerd[1523]: time="2025-03-25T02:01:17.479152561Z" level=info msg="CreateContainer within sandbox \"5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 02:01:17.496170 containerd[1523]: time="2025-03-25T02:01:17.496108540Z" level=info msg="Container 4f4e2863e2abf2ea720e7d0e60126e8385c96ece3a5911f45f788eae79ec88f1: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:01:17.503643 containerd[1523]: time="2025-03-25T02:01:17.503602176Z" level=info msg="CreateContainer within sandbox \"5264da98c837f7e7dca108caef3b2a922cfbd32724f4a6229423d4280c2fa552\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4f4e2863e2abf2ea720e7d0e60126e8385c96ece3a5911f45f788eae79ec88f1\"" Mar 25 02:01:17.506995 containerd[1523]: time="2025-03-25T02:01:17.506962776Z" level=info msg="StartContainer for \"4f4e2863e2abf2ea720e7d0e60126e8385c96ece3a5911f45f788eae79ec88f1\"" Mar 25 02:01:17.509473 containerd[1523]: time="2025-03-25T02:01:17.509289374Z" level=info msg="connecting to shim 4f4e2863e2abf2ea720e7d0e60126e8385c96ece3a5911f45f788eae79ec88f1" address="unix:///run/containerd/s/638326288d13561badb2176de0424201420b8a94b82132d8fc32c0e23b067dd1" protocol=ttrpc version=3 Mar 25 02:01:17.546322 systemd[1]: Started cri-containerd-4f4e2863e2abf2ea720e7d0e60126e8385c96ece3a5911f45f788eae79ec88f1.scope - libcontainer container 4f4e2863e2abf2ea720e7d0e60126e8385c96ece3a5911f45f788eae79ec88f1. 
Mar 25 02:01:17.700962 kubelet[2782]: I0325 02:01:17.699065 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7c6dff454b-6s6kn" podStartSLOduration=37.562607867 podStartE2EDuration="42.698563313s" podCreationTimestamp="2025-03-25 02:00:35 +0000 UTC" firstStartedPulling="2025-03-25 02:01:11.434046658 +0000 UTC m=+48.799013910" lastFinishedPulling="2025-03-25 02:01:16.570002112 +0000 UTC m=+53.934969356" observedRunningTime="2025-03-25 02:01:17.694322218 +0000 UTC m=+55.059289481" watchObservedRunningTime="2025-03-25 02:01:17.698563313 +0000 UTC m=+55.063530575" Mar 25 02:01:17.703249 containerd[1523]: time="2025-03-25T02:01:17.699469194Z" level=info msg="StartContainer for \"4f4e2863e2abf2ea720e7d0e60126e8385c96ece3a5911f45f788eae79ec88f1\" returns successfully" Mar 25 02:01:18.699024 systemd-networkd[1448]: calic4edf12a31f: Gained IPv6LL Mar 25 02:01:18.977440 containerd[1523]: time="2025-03-25T02:01:18.976780653Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-f58bh,Uid:d129c3b4-f6a0-4232-9067-911267ef131b,Namespace:kube-system,Attempt:0,}" Mar 25 02:01:19.199082 systemd-networkd[1448]: cali3f15eea863c: Link UP Mar 25 02:01:19.200524 systemd-networkd[1448]: cali3f15eea863c: Gained carrier Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.071 [INFO][4486] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0 coredns-668d6bf9bc- kube-system d129c3b4-f6a0-4232-9067-911267ef131b 726 0 2025-03-25 02:00:26 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-u0apo.gb1.brightbox.com coredns-668d6bf9bc-f58bh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3f15eea863c [{dns UDP 53 0 } {dns-tcp 
TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-f58bh" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.071 [INFO][4486] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-f58bh" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.134 [INFO][4498] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" HandleID="k8s-pod-network.5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" Workload="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.149 [INFO][4498] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" HandleID="k8s-pod-network.5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" Workload="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030f940), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-u0apo.gb1.brightbox.com", "pod":"coredns-668d6bf9bc-f58bh", "timestamp":"2025-03-25 02:01:19.134189077 +0000 UTC"}, Hostname:"srv-u0apo.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.149 [INFO][4498] ipam/ipam_plugin.go 353: About to acquire host-wide 
IPAM lock. Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.149 [INFO][4498] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.149 [INFO][4498] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-u0apo.gb1.brightbox.com' Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.152 [INFO][4498] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.157 [INFO][4498] ipam/ipam.go 372: Looking up existing affinities for host host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.164 [INFO][4498] ipam/ipam.go 489: Trying affinity for 192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.167 [INFO][4498] ipam/ipam.go 155: Attempting to load block cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.170 [INFO][4498] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.170 [INFO][4498] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.173 [INFO][4498] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9 Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.180 [INFO][4498] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.116.64/26 
handle="k8s-pod-network.5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.190 [INFO][4498] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.116.68/26] block=192.168.116.64/26 handle="k8s-pod-network.5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.190 [INFO][4498] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.116.68/26] handle="k8s-pod-network.5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.190 [INFO][4498] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 02:01:19.229641 containerd[1523]: 2025-03-25 02:01:19.190 [INFO][4498] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.68/26] IPv6=[] ContainerID="5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" HandleID="k8s-pod-network.5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" Workload="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0" Mar 25 02:01:19.234096 kubelet[2782]: I0325 02:01:19.228826 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7c6dff454b-6m8j9" podStartSLOduration=44.228683399 podStartE2EDuration="44.228683399s" podCreationTimestamp="2025-03-25 02:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:01:18.742782487 +0000 UTC m=+56.107749753" watchObservedRunningTime="2025-03-25 02:01:19.228683399 +0000 UTC m=+56.593650656" Mar 25 02:01:19.236590 containerd[1523]: 2025-03-25 02:01:19.194 [INFO][4486] cni-plugin/k8s.go 386: Populated endpoint 
ContainerID="5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-f58bh" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"d129c3b4-f6a0-4232-9067-911267ef131b", ResourceVersion:"726", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"", Pod:"coredns-668d6bf9bc-f58bh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3f15eea863c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 
02:01:19.236590 containerd[1523]: 2025-03-25 02:01:19.194 [INFO][4486] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.116.68/32] ContainerID="5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-f58bh" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0" Mar 25 02:01:19.236590 containerd[1523]: 2025-03-25 02:01:19.194 [INFO][4486] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3f15eea863c ContainerID="5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-f58bh" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0" Mar 25 02:01:19.236590 containerd[1523]: 2025-03-25 02:01:19.199 [INFO][4486] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-f58bh" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0" Mar 25 02:01:19.236590 containerd[1523]: 2025-03-25 02:01:19.200 [INFO][4486] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-f58bh" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"d129c3b4-f6a0-4232-9067-911267ef131b", ResourceVersion:"726", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9", Pod:"coredns-668d6bf9bc-f58bh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3f15eea863c", MAC:"8a:1a:81:d4:00:2c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:01:19.236590 containerd[1523]: 2025-03-25 02:01:19.220 [INFO][4486] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-f58bh" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-coredns--668d6bf9bc--f58bh-eth0" Mar 25 02:01:19.462705 containerd[1523]: time="2025-03-25T02:01:19.462208985Z" level=info msg="connecting to shim 5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9" address="unix:///run/containerd/s/b270ba52049d07f93345f71c31e269ed016f150405e6b71edf51911fd0d899b0" namespace=k8s.io 
protocol=ttrpc version=3 Mar 25 02:01:19.507820 systemd[1]: Started cri-containerd-5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9.scope - libcontainer container 5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9. Mar 25 02:01:19.576242 containerd[1523]: time="2025-03-25T02:01:19.576187976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-f58bh,Uid:d129c3b4-f6a0-4232-9067-911267ef131b,Namespace:kube-system,Attempt:0,} returns sandbox id \"5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9\"" Mar 25 02:01:19.581641 containerd[1523]: time="2025-03-25T02:01:19.581603960Z" level=info msg="CreateContainer within sandbox \"5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 02:01:19.597495 containerd[1523]: time="2025-03-25T02:01:19.597401694Z" level=info msg="Container b8a720f87703e2512ed6fd04cc433e96ad4f027b125a8c1d6ef5204c958ddc8a: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:01:19.606730 containerd[1523]: time="2025-03-25T02:01:19.606668622Z" level=info msg="CreateContainer within sandbox \"5ff6916ecb94e1df9e633c470224f041f0c8625ae172fca29da35f09487778c9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b8a720f87703e2512ed6fd04cc433e96ad4f027b125a8c1d6ef5204c958ddc8a\"" Mar 25 02:01:19.607895 containerd[1523]: time="2025-03-25T02:01:19.607822673Z" level=info msg="StartContainer for \"b8a720f87703e2512ed6fd04cc433e96ad4f027b125a8c1d6ef5204c958ddc8a\"" Mar 25 02:01:19.609737 containerd[1523]: time="2025-03-25T02:01:19.609703925Z" level=info msg="connecting to shim b8a720f87703e2512ed6fd04cc433e96ad4f027b125a8c1d6ef5204c958ddc8a" address="unix:///run/containerd/s/b270ba52049d07f93345f71c31e269ed016f150405e6b71edf51911fd0d899b0" protocol=ttrpc version=3 Mar 25 02:01:19.647782 systemd[1]: Started cri-containerd-b8a720f87703e2512ed6fd04cc433e96ad4f027b125a8c1d6ef5204c958ddc8a.scope - 
libcontainer container b8a720f87703e2512ed6fd04cc433e96ad4f027b125a8c1d6ef5204c958ddc8a. Mar 25 02:01:19.720045 containerd[1523]: time="2025-03-25T02:01:19.719088632Z" level=info msg="StartContainer for \"b8a720f87703e2512ed6fd04cc433e96ad4f027b125a8c1d6ef5204c958ddc8a\" returns successfully" Mar 25 02:01:19.729042 kubelet[2782]: I0325 02:01:19.728746 2782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 02:01:20.750733 kubelet[2782]: I0325 02:01:20.750428 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-f58bh" podStartSLOduration=54.75038012 podStartE2EDuration="54.75038012s" podCreationTimestamp="2025-03-25 02:00:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:01:20.748599062 +0000 UTC m=+58.113566324" watchObservedRunningTime="2025-03-25 02:01:20.75038012 +0000 UTC m=+58.115347378" Mar 25 02:01:20.978345 containerd[1523]: time="2025-03-25T02:01:20.978121237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b9d7665cb-7crdx,Uid:760bc0b6-71e1-4be3-8d8e-ceb103060e98,Namespace:calico-system,Attempt:0,}" Mar 25 02:01:21.130816 systemd-networkd[1448]: cali3f15eea863c: Gained IPv6LL Mar 25 02:01:21.184489 systemd-networkd[1448]: cali99a7b52eb6a: Link UP Mar 25 02:01:21.185494 systemd-networkd[1448]: cali99a7b52eb6a: Gained carrier Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.059 [INFO][4601] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0 calico-kube-controllers-5b9d7665cb- calico-system 760bc0b6-71e1-4be3-8d8e-ceb103060e98 724 0 2025-03-25 02:00:35 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5b9d7665cb 
projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-u0apo.gb1.brightbox.com calico-kube-controllers-5b9d7665cb-7crdx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali99a7b52eb6a [] []}} ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Namespace="calico-system" Pod="calico-kube-controllers-5b9d7665cb-7crdx" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.059 [INFO][4601] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Namespace="calico-system" Pod="calico-kube-controllers-5b9d7665cb-7crdx" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.113 [INFO][4613] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.128 [INFO][4613] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000291c70), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-u0apo.gb1.brightbox.com", 
"pod":"calico-kube-controllers-5b9d7665cb-7crdx", "timestamp":"2025-03-25 02:01:21.113799595 +0000 UTC"}, Hostname:"srv-u0apo.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.128 [INFO][4613] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.128 [INFO][4613] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.128 [INFO][4613] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-u0apo.gb1.brightbox.com' Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.132 [INFO][4613] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.139 [INFO][4613] ipam/ipam.go 372: Looking up existing affinities for host host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.146 [INFO][4613] ipam/ipam.go 489: Trying affinity for 192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.150 [INFO][4613] ipam/ipam.go 155: Attempting to load block cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.154 [INFO][4613] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.154 [INFO][4613] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.116.64/26 
handle="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.156 [INFO][4613] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0 Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.164 [INFO][4613] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.175 [INFO][4613] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.116.69/26] block=192.168.116.64/26 handle="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.175 [INFO][4613] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.116.69/26] handle="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.175 [INFO][4613] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 02:01:21.209808 containerd[1523]: 2025-03-25 02:01:21.175 [INFO][4613] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.69/26] IPv6=[] ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0" Mar 25 02:01:21.213096 containerd[1523]: 2025-03-25 02:01:21.179 [INFO][4601] cni-plugin/k8s.go 386: Populated endpoint ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Namespace="calico-system" Pod="calico-kube-controllers-5b9d7665cb-7crdx" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0", GenerateName:"calico-kube-controllers-5b9d7665cb-", Namespace:"calico-system", SelfLink:"", UID:"760bc0b6-71e1-4be3-8d8e-ceb103060e98", ResourceVersion:"724", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5b9d7665cb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-5b9d7665cb-7crdx", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.116.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali99a7b52eb6a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:01:21.213096 containerd[1523]: 2025-03-25 02:01:21.179 [INFO][4601] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.116.69/32] ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Namespace="calico-system" Pod="calico-kube-controllers-5b9d7665cb-7crdx" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0" Mar 25 02:01:21.213096 containerd[1523]: 2025-03-25 02:01:21.179 [INFO][4601] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali99a7b52eb6a ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Namespace="calico-system" Pod="calico-kube-controllers-5b9d7665cb-7crdx" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0" Mar 25 02:01:21.213096 containerd[1523]: 2025-03-25 02:01:21.186 [INFO][4601] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Namespace="calico-system" Pod="calico-kube-controllers-5b9d7665cb-7crdx" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0" Mar 25 02:01:21.213096 containerd[1523]: 2025-03-25 02:01:21.187 [INFO][4601] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Namespace="calico-system" Pod="calico-kube-controllers-5b9d7665cb-7crdx" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0", GenerateName:"calico-kube-controllers-5b9d7665cb-", Namespace:"calico-system", SelfLink:"", UID:"760bc0b6-71e1-4be3-8d8e-ceb103060e98", ResourceVersion:"724", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5b9d7665cb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0", Pod:"calico-kube-controllers-5b9d7665cb-7crdx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.116.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali99a7b52eb6a", MAC:"ee:90:d0:52:1f:04", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:01:21.213096 containerd[1523]: 2025-03-25 02:01:21.206 [INFO][4601] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Namespace="calico-system" Pod="calico-kube-controllers-5b9d7665cb-7crdx" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0" Mar 25 
02:01:21.258051 containerd[1523]: time="2025-03-25T02:01:21.257500576Z" level=info msg="connecting to shim 0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" address="unix:///run/containerd/s/8b55140683536c8243565411ee2489c2dd979343712ae1853a6d8bda9a2fda0b" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:01:21.302805 systemd[1]: Started cri-containerd-0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0.scope - libcontainer container 0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0. Mar 25 02:01:21.387525 containerd[1523]: time="2025-03-25T02:01:21.387321059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5b9d7665cb-7crdx,Uid:760bc0b6-71e1-4be3-8d8e-ceb103060e98,Namespace:calico-system,Attempt:0,} returns sandbox id \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\"" Mar 25 02:01:21.391594 containerd[1523]: time="2025-03-25T02:01:21.391564433Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 25 02:01:22.989039 containerd[1523]: time="2025-03-25T02:01:22.988978428Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rfmq7,Uid:b452acd9-0dc2-4c9b-905e-864aa551f46e,Namespace:calico-system,Attempt:0,}" Mar 25 02:01:23.050849 systemd-networkd[1448]: cali99a7b52eb6a: Gained IPv6LL Mar 25 02:01:23.324580 systemd-networkd[1448]: calid0404421668: Link UP Mar 25 02:01:23.327065 systemd-networkd[1448]: calid0404421668: Gained carrier Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.102 [INFO][4686] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0 csi-node-driver- calico-system b452acd9-0dc2-4c9b-905e-864aa551f46e 579 0 2025-03-25 02:00:34 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:54877d75d5 k8s-app:csi-node-driver name:csi-node-driver 
pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-u0apo.gb1.brightbox.com csi-node-driver-rfmq7 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid0404421668 [] []}} ContainerID="6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" Namespace="calico-system" Pod="csi-node-driver-rfmq7" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.102 [INFO][4686] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" Namespace="calico-system" Pod="csi-node-driver-rfmq7" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.220 [INFO][4700] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" HandleID="k8s-pod-network.6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" Workload="srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.246 [INFO][4700] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" HandleID="k8s-pod-network.6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" Workload="srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051350), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-u0apo.gb1.brightbox.com", "pod":"csi-node-driver-rfmq7", "timestamp":"2025-03-25 02:01:23.220392324 +0000 UTC"}, Hostname:"srv-u0apo.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, 
MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.246 [INFO][4700] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.248 [INFO][4700] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.248 [INFO][4700] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-u0apo.gb1.brightbox.com' Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.251 [INFO][4700] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.260 [INFO][4700] ipam/ipam.go 372: Looking up existing affinities for host host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.268 [INFO][4700] ipam/ipam.go 489: Trying affinity for 192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.273 [INFO][4700] ipam/ipam.go 155: Attempting to load block cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.277 [INFO][4700] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.277 [INFO][4700] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.280 [INFO][4700] ipam/ipam.go 1685: Creating new handle: 
k8s-pod-network.6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2 Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.290 [INFO][4700] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.312 [INFO][4700] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.116.70/26] block=192.168.116.64/26 handle="k8s-pod-network.6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.315 [INFO][4700] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.116.70/26] handle="k8s-pod-network.6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" host="srv-u0apo.gb1.brightbox.com" Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.315 [INFO][4700] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 02:01:23.369005 containerd[1523]: 2025-03-25 02:01:23.315 [INFO][4700] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.70/26] IPv6=[] ContainerID="6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" HandleID="k8s-pod-network.6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" Workload="srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0" Mar 25 02:01:23.371002 containerd[1523]: 2025-03-25 02:01:23.318 [INFO][4686] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" Namespace="calico-system" Pod="csi-node-driver-rfmq7" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b452acd9-0dc2-4c9b-905e-864aa551f46e", ResourceVersion:"579", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-rfmq7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.116.70/32"}, 
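The IPAM sequence recorded above (acquire the host-wide IPAM lock, confirm the host's affinity for block 192.168.116.64/26, claim the next free address, release the lock) can be illustrated with a minimal sketch. The block CIDR and the resulting address come from the log; the class, its methods, and the handle string are simplified stand-ins, not Calico's actual datastore logic.

```python
import ipaddress
import threading

# Simplified stand-in for Calico's per-host IPAM: one affine /26 block,
# a host-wide lock, and first-free assignment. Real Calico persists
# allocations and handles in its datastore; this only mirrors the flow
# visible in the log lines above.
class BlockIPAM:
    def __init__(self, cidr: str):
        self.block = ipaddress.ip_network(cidr)
        self.allocated: dict[str, str] = {}   # ip -> handle
        self.lock = threading.Lock()          # the "host-wide IPAM lock"

    def auto_assign(self, handle: str) -> str:
        with self.lock:                       # "About to acquire host-wide IPAM lock"
            for ip in self.block.hosts():     # attempt addresses from the block
                key = str(ip)
                if key not in self.allocated:
                    self.allocated[key] = handle   # write block to claim the IP
                    return f"{key}/32"
            raise RuntimeError(f"block {self.block} exhausted")

ipam = BlockIPAM("192.168.116.64/26")
# Earlier pods in this log already hold .65-.69 (the kube-controllers pod
# got .69); the csi-node-driver pod then receives 192.168.116.70.
for n in range(65, 70):
    ipam.allocated[f"192.168.116.{n}"] = f"existing-{n}"
print(ipam.auto_assign("k8s-pod-network.6733b8ff"))  # 192.168.116.70/32
```

The `/32` result matches the "Successfully claimed IPs: [192.168.116.70/26]" line above: workload endpoints are addressed individually even though allocation happens per block.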
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid0404421668", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:01:23.371002 containerd[1523]: 2025-03-25 02:01:23.318 [INFO][4686] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.116.70/32] ContainerID="6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" Namespace="calico-system" Pod="csi-node-driver-rfmq7" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0" Mar 25 02:01:23.371002 containerd[1523]: 2025-03-25 02:01:23.318 [INFO][4686] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid0404421668 ContainerID="6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" Namespace="calico-system" Pod="csi-node-driver-rfmq7" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0" Mar 25 02:01:23.371002 containerd[1523]: 2025-03-25 02:01:23.325 [INFO][4686] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" Namespace="calico-system" Pod="csi-node-driver-rfmq7" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0" Mar 25 02:01:23.371002 containerd[1523]: 2025-03-25 02:01:23.327 [INFO][4686] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" Namespace="calico-system" Pod="csi-node-driver-rfmq7" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", 
SelfLink:"", UID:"b452acd9-0dc2-4c9b-905e-864aa551f46e", ResourceVersion:"579", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 0, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2", Pod:"csi-node-driver-rfmq7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.116.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid0404421668", MAC:"a6:be:78:e0:5d:5b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 02:01:23.371002 containerd[1523]: 2025-03-25 02:01:23.360 [INFO][4686] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" Namespace="calico-system" Pod="csi-node-driver-rfmq7" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-csi--node--driver--rfmq7-eth0" Mar 25 02:01:23.469580 containerd[1523]: time="2025-03-25T02:01:23.469477867Z" level=info msg="connecting to shim 6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2" address="unix:///run/containerd/s/4a3010199744881de5e9a73f7639a486f75af33ef2c66b091f8d3952685a50c2" namespace=k8s.io protocol=ttrpc version=3 Mar 25 02:01:23.526809 
systemd[1]: Started cri-containerd-6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2.scope - libcontainer container 6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2. Mar 25 02:01:23.592766 containerd[1523]: time="2025-03-25T02:01:23.592702695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rfmq7,Uid:b452acd9-0dc2-4c9b-905e-864aa551f46e,Namespace:calico-system,Attempt:0,} returns sandbox id \"6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2\"" Mar 25 02:01:24.959769 containerd[1523]: time="2025-03-25T02:01:24.959656887Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:24.960932 containerd[1523]: time="2025-03-25T02:01:24.960524772Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912" Mar 25 02:01:24.962070 containerd[1523]: time="2025-03-25T02:01:24.962030323Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:24.965732 containerd[1523]: time="2025-03-25T02:01:24.965675202Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:24.966958 containerd[1523]: time="2025-03-25T02:01:24.966738333Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 3.574908466s" Mar 25 
02:01:24.966958 containerd[1523]: time="2025-03-25T02:01:24.966782736Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\"" Mar 25 02:01:24.969280 containerd[1523]: time="2025-03-25T02:01:24.969254868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 25 02:01:24.999190 containerd[1523]: time="2025-03-25T02:01:24.996706146Z" level=info msg="CreateContainer within sandbox \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 25 02:01:25.008565 containerd[1523]: time="2025-03-25T02:01:25.006969754Z" level=info msg="Container e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:01:25.019934 containerd[1523]: time="2025-03-25T02:01:25.019858068Z" level=info msg="CreateContainer within sandbox \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\"" Mar 25 02:01:25.023112 containerd[1523]: time="2025-03-25T02:01:25.021798666Z" level=info msg="StartContainer for \"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\"" Mar 25 02:01:25.023888 containerd[1523]: time="2025-03-25T02:01:25.023855274Z" level=info msg="connecting to shim e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0" address="unix:///run/containerd/s/8b55140683536c8243565411ee2489c2dd979343712ae1853a6d8bda9a2fda0b" protocol=ttrpc version=3 Mar 25 02:01:25.086792 systemd[1]: Started cri-containerd-e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0.scope - libcontainer container e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0. 
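The pull messages above identify the same image three ways: repo tag (`ghcr.io/flatcar/calico/kube-controllers:v3.29.2`), repo digest (`...@sha256:6d1f39...`), and image ID. A minimal splitter for those reference forms looks like this; it is a simplified sketch, not containerd's full reference grammar, and `parse_image_ref` is a hypothetical helper name.

```python
# Split an image reference of the forms seen in the log above:
#   name:tag           (repo tag)
#   name@sha256:...    (repo digest)
# Registry ports (e.g. localhost:5000/img) must not be mistaken for tags.
def parse_image_ref(ref: str) -> dict:
    digest = None
    if "@" in ref:
        ref, digest = ref.split("@", 1)
    # the tag is whatever follows the last ':' -- unless that segment
    # contains '/', in which case the ':' belonged to a registry port
    name, sep, tag = ref.rpartition(":")
    if not sep or "/" in tag:
        name, tag = ref, None
    return {"name": name, "tag": tag, "digest": digest}

r = parse_image_ref("ghcr.io/flatcar/calico/kube-controllers:v3.29.2")
print(r["name"], r["tag"])  # ghcr.io/flatcar/calico/kube-controllers v3.29.2
```

Pulling by digest, as kubelet does once it has resolved one, pins the exact manifest; the tag is only a mutable pointer.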
Mar 25 02:01:25.098772 systemd-networkd[1448]: calid0404421668: Gained IPv6LL Mar 25 02:01:25.178809 containerd[1523]: time="2025-03-25T02:01:25.178756195Z" level=info msg="StartContainer for \"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\" returns successfully" Mar 25 02:01:25.812437 kubelet[2782]: I0325 02:01:25.810529 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5b9d7665cb-7crdx" podStartSLOduration=47.232351718 podStartE2EDuration="50.810448895s" podCreationTimestamp="2025-03-25 02:00:35 +0000 UTC" firstStartedPulling="2025-03-25 02:01:21.39060554 +0000 UTC m=+58.755572803" lastFinishedPulling="2025-03-25 02:01:24.968702723 +0000 UTC m=+62.333669980" observedRunningTime="2025-03-25 02:01:25.808794105 +0000 UTC m=+63.173761367" watchObservedRunningTime="2025-03-25 02:01:25.810448895 +0000 UTC m=+63.175416140" Mar 25 02:01:25.884088 containerd[1523]: time="2025-03-25T02:01:25.884017340Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\" id:\"2129359fa6adc8d3a998f83bd2de1134cccde6c437baa42ca37a50180e4527df\" pid:4817 exited_at:{seconds:1742868085 nanos:883318992}" Mar 25 02:01:26.952160 containerd[1523]: time="2025-03-25T02:01:26.951977547Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:26.954015 containerd[1523]: time="2025-03-25T02:01:26.953911236Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887" Mar 25 02:01:26.955141 containerd[1523]: time="2025-03-25T02:01:26.955044379Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:26.959342 containerd[1523]: time="2025-03-25T02:01:26.957881974Z" 
level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:26.959342 containerd[1523]: time="2025-03-25T02:01:26.958995062Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 1.98955729s" Mar 25 02:01:26.959342 containerd[1523]: time="2025-03-25T02:01:26.959058153Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\"" Mar 25 02:01:26.965879 containerd[1523]: time="2025-03-25T02:01:26.965838643Z" level=info msg="CreateContainer within sandbox \"6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 25 02:01:26.987934 containerd[1523]: time="2025-03-25T02:01:26.987872374Z" level=info msg="Container 97562188d7e7290534fec86a966e812946340f26a11e7d1fae2f956a877e6893: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:01:26.995309 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3650197855.mount: Deactivated successfully. 
Mar 25 02:01:27.005626 containerd[1523]: time="2025-03-25T02:01:27.005173065Z" level=info msg="CreateContainer within sandbox \"6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"97562188d7e7290534fec86a966e812946340f26a11e7d1fae2f956a877e6893\"" Mar 25 02:01:27.006255 containerd[1523]: time="2025-03-25T02:01:27.006174886Z" level=info msg="StartContainer for \"97562188d7e7290534fec86a966e812946340f26a11e7d1fae2f956a877e6893\"" Mar 25 02:01:27.009991 containerd[1523]: time="2025-03-25T02:01:27.009849761Z" level=info msg="connecting to shim 97562188d7e7290534fec86a966e812946340f26a11e7d1fae2f956a877e6893" address="unix:///run/containerd/s/4a3010199744881de5e9a73f7639a486f75af33ef2c66b091f8d3952685a50c2" protocol=ttrpc version=3 Mar 25 02:01:27.049815 systemd[1]: Started cri-containerd-97562188d7e7290534fec86a966e812946340f26a11e7d1fae2f956a877e6893.scope - libcontainer container 97562188d7e7290534fec86a966e812946340f26a11e7d1fae2f956a877e6893. 
Mar 25 02:01:27.281856 containerd[1523]: time="2025-03-25T02:01:27.278423862Z" level=info msg="StartContainer for \"97562188d7e7290534fec86a966e812946340f26a11e7d1fae2f956a877e6893\" returns successfully" Mar 25 02:01:27.288556 containerd[1523]: time="2025-03-25T02:01:27.288505476Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 25 02:01:29.406756 containerd[1523]: time="2025-03-25T02:01:29.404836308Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:29.406756 containerd[1523]: time="2025-03-25T02:01:29.406040521Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843" Mar 25 02:01:29.407816 containerd[1523]: time="2025-03-25T02:01:29.407606434Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:29.411109 containerd[1523]: time="2025-03-25T02:01:29.411061656Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 02:01:29.412673 containerd[1523]: time="2025-03-25T02:01:29.412639183Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.123876798s" Mar 25 02:01:29.413486 containerd[1523]: time="2025-03-25T02:01:29.413456878Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\"" Mar 25 02:01:29.419907 containerd[1523]: time="2025-03-25T02:01:29.419869793Z" level=info msg="CreateContainer within sandbox \"6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 25 02:01:29.431788 containerd[1523]: time="2025-03-25T02:01:29.431754601Z" level=info msg="Container 523048a36c3d4b75f6c8b137b26e8a44e866601c6c984ea9481f74038fad4b28: CDI devices from CRI Config.CDIDevices: []" Mar 25 02:01:29.466152 containerd[1523]: time="2025-03-25T02:01:29.466071011Z" level=info msg="CreateContainer within sandbox \"6733b8ffd622883b45506ace0fe8d347347a40e2603a9cef76895567365862d2\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"523048a36c3d4b75f6c8b137b26e8a44e866601c6c984ea9481f74038fad4b28\"" Mar 25 02:01:29.468139 containerd[1523]: time="2025-03-25T02:01:29.468096041Z" level=info msg="StartContainer for \"523048a36c3d4b75f6c8b137b26e8a44e866601c6c984ea9481f74038fad4b28\"" Mar 25 02:01:29.471495 containerd[1523]: time="2025-03-25T02:01:29.471456535Z" level=info msg="connecting to shim 523048a36c3d4b75f6c8b137b26e8a44e866601c6c984ea9481f74038fad4b28" address="unix:///run/containerd/s/4a3010199744881de5e9a73f7639a486f75af33ef2c66b091f8d3952685a50c2" protocol=ttrpc version=3 Mar 25 02:01:29.515858 systemd[1]: Started cri-containerd-523048a36c3d4b75f6c8b137b26e8a44e866601c6c984ea9481f74038fad4b28.scope - libcontainer container 523048a36c3d4b75f6c8b137b26e8a44e866601c6c984ea9481f74038fad4b28. 
Mar 25 02:01:29.603123 containerd[1523]: time="2025-03-25T02:01:29.602963919Z" level=info msg="StartContainer for \"523048a36c3d4b75f6c8b137b26e8a44e866601c6c984ea9481f74038fad4b28\" returns successfully" Mar 25 02:01:29.862929 kubelet[2782]: I0325 02:01:29.862550 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-rfmq7" podStartSLOduration=50.045540049 podStartE2EDuration="55.86239153s" podCreationTimestamp="2025-03-25 02:00:34 +0000 UTC" firstStartedPulling="2025-03-25 02:01:23.598446299 +0000 UTC m=+60.963413545" lastFinishedPulling="2025-03-25 02:01:29.415297774 +0000 UTC m=+66.780265026" observedRunningTime="2025-03-25 02:01:29.852943495 +0000 UTC m=+67.217910756" watchObservedRunningTime="2025-03-25 02:01:29.86239153 +0000 UTC m=+67.227358811" Mar 25 02:01:30.376737 kubelet[2782]: I0325 02:01:30.376530 2782 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 25 02:01:30.386013 kubelet[2782]: I0325 02:01:30.385859 2782 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 25 02:01:35.662278 containerd[1523]: time="2025-03-25T02:01:35.662173833Z" level=info msg="StopContainer for \"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\" with timeout 300 (s)" Mar 25 02:01:35.673620 containerd[1523]: time="2025-03-25T02:01:35.673534657Z" level=info msg="Stop container \"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\" with signal terminated" Mar 25 02:01:36.170318 containerd[1523]: time="2025-03-25T02:01:36.170156604Z" level=info msg="StopContainer for \"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\" with timeout 30 (s)" Mar 25 02:01:36.173923 containerd[1523]: time="2025-03-25T02:01:36.173473724Z" level=info msg="Stop container 
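The `pod_startup_latency_tracker` lines above report two durations per pod: an end-to-end duration and a shorter "SLO" duration that excludes time spent pulling images. Recomputing from the monotonic `m=+...` offsets logged for csi-node-driver-rfmq7 reproduces both figures. The formula is inferred from these log lines, not quoted from kubelet source, and the function name is a hypothetical stand-in.

```python
# Reconstruct the two durations printed by kubelet's startup latency
# tracker, using the monotonic clock offsets (m=+...) from the log:
#   E2E  = observed running time - pod creation time
#   SLO  = E2E - (lastFinishedPulling - firstStartedPulling)
def pod_start_durations(created_to_observed: float,
                        first_pull_m: float,
                        last_pull_m: float) -> tuple[float, float]:
    e2e = created_to_observed
    slo = e2e - (last_pull_m - first_pull_m)  # subtract image-pull time
    return slo, e2e

slo, e2e = pod_start_durations(
    created_to_observed=55.86239153,   # watchObservedRunningTime - creation
    first_pull_m=60.963413545,         # m=+ offset at firstStartedPulling
    last_pull_m=66.780265026,          # m=+ offset at lastFinishedPulling
)
print(slo, e2e)
```

The result matches the logged `podStartSLOduration=50.045540049` and `podStartE2EDuration="55.86239153s"`: about 5.8 s of the 55.9 s startup went to pulling the csi and node-driver-registrar images.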
\"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\" with signal terminated" Mar 25 02:01:36.206103 systemd[1]: cri-containerd-e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0.scope: Deactivated successfully. Mar 25 02:01:36.225946 containerd[1523]: time="2025-03-25T02:01:36.225857086Z" level=info msg="received exit event container_id:\"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\" id:\"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\" pid:4788 exit_status:2 exited_at:{seconds:1742868096 nanos:225331511}" Mar 25 02:01:36.226754 containerd[1523]: time="2025-03-25T02:01:36.226434207Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\" id:\"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\" pid:4788 exit_status:2 exited_at:{seconds:1742868096 nanos:225331511}" Mar 25 02:01:36.311003 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0-rootfs.mount: Deactivated successfully. 
Mar 25 02:01:36.461807 containerd[1523]: time="2025-03-25T02:01:36.461397463Z" level=info msg="StopContainer for \"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\" returns successfully" Mar 25 02:01:36.484872 containerd[1523]: time="2025-03-25T02:01:36.484014407Z" level=info msg="StopPodSandbox for \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\"" Mar 25 02:01:36.521458 containerd[1523]: time="2025-03-25T02:01:36.521131645Z" level=info msg="Container to stop \"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 25 02:01:36.529374 containerd[1523]: time="2025-03-25T02:01:36.529307778Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" id:\"1ef7bf9e90451aae36ca3c4d037d4a2bdd6cad4d01f3cac744091874b2d0f7a6\" pid:4950 exited_at:{seconds:1742868096 nanos:527101507}" Mar 25 02:01:36.547769 systemd[1]: cri-containerd-0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0.scope: Deactivated successfully. 
Mar 25 02:01:36.557551 containerd[1523]: time="2025-03-25T02:01:36.557453070Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\" id:\"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\" pid:4665 exit_status:137 exited_at:{seconds:1742868096 nanos:556917893}" Mar 25 02:01:36.640949 containerd[1523]: time="2025-03-25T02:01:36.640898522Z" level=info msg="StopContainer for \"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" with timeout 5 (s)" Mar 25 02:01:36.642243 containerd[1523]: time="2025-03-25T02:01:36.642208669Z" level=info msg="Stop container \"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" with signal terminated" Mar 25 02:01:36.654927 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0-rootfs.mount: Deactivated successfully. Mar 25 02:01:36.706485 containerd[1523]: time="2025-03-25T02:01:36.706433713Z" level=info msg="shim disconnected" id=0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0 namespace=k8s.io Mar 25 02:01:36.707235 containerd[1523]: time="2025-03-25T02:01:36.707196888Z" level=warning msg="cleaning up after shim disconnected" id=0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0 namespace=k8s.io Mar 25 02:01:36.726123 systemd[1]: cri-containerd-b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27.scope: Deactivated successfully. Mar 25 02:01:36.726725 systemd[1]: cri-containerd-b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27.scope: Consumed 2.954s CPU time, 187.5M memory peak, 20.4M read from disk, 664K written to disk. 
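The sandbox teardown above logs `exit_status:137` (and the kube-controllers container earlier exited with status 2 after its SIGTERM). By the common shell/runtime convention, statuses above 128 encode 128 + signal number, so 137 means the process was killed by SIGKILL (9). A small decoder, assuming that convention:

```python
import signal

# Decode a wait status following the 128 + signum convention used by
# shells and container runtimes for signal-terminated processes.
# Returns the signal name, or None for a plain (non-signal) exit code.
def exit_status_signal(status: int):
    if status > 128:
        return signal.Signals(status - 128).name
    return None

print(exit_status_signal(137))  # SIGKILL
```

Seeing 137 on a pod sandbox during StopPodSandbox is expected here: the runtime force-kills the pause process when the sandbox is torn down, rather than indicating an OOM kill (which also reports 137 but comes with kernel OOM log lines).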
Mar 25 02:01:36.739773 containerd[1523]: time="2025-03-25T02:01:36.707414198Z" level=error msg="failed sending message on channel" error="write unix /run/containerd/s/8b55140683536c8243565411ee2489c2dd979343712ae1853a6d8bda9a2fda0b->@: write: broken pipe" runtime=io.containerd.runc.v2 Mar 25 02:01:36.740047 containerd[1523]: time="2025-03-25T02:01:36.707341518Z" level=info msg="cleaning up dead shim" namespace=k8s.io Mar 25 02:01:36.741564 containerd[1523]: time="2025-03-25T02:01:36.741495097Z" level=info msg="received exit event container_id:\"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" id:\"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" pid:3741 exited_at:{seconds:1742868096 nanos:736850417}" Mar 25 02:01:36.835514 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27-rootfs.mount: Deactivated successfully. Mar 25 02:01:36.873760 containerd[1523]: time="2025-03-25T02:01:36.872667392Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" id:\"25e4d6b1809b7346c842a89a32e6d7484f40f05e474bd40c7fc876e086a7a7bb\" pid:4939 exited_at:{seconds:1742868096 nanos:630464590}" Mar 25 02:01:36.874409 containerd[1523]: time="2025-03-25T02:01:36.874341041Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" id:\"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" pid:3741 exited_at:{seconds:1742868096 nanos:736850417}" Mar 25 02:01:36.885959 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0-shm.mount: Deactivated successfully. 
Mar 25 02:01:36.892232 containerd[1523]: time="2025-03-25T02:01:36.892176685Z" level=info msg="StopContainer for \"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" returns successfully"
Mar 25 02:01:36.905513 containerd[1523]: time="2025-03-25T02:01:36.904007614Z" level=info msg="StopPodSandbox for \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\""
Mar 25 02:01:36.905513 containerd[1523]: time="2025-03-25T02:01:36.904211131Z" level=info msg="Container to stop \"dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 25 02:01:36.905513 containerd[1523]: time="2025-03-25T02:01:36.904273997Z" level=info msg="Container to stop \"a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 25 02:01:36.905513 containerd[1523]: time="2025-03-25T02:01:36.904295057Z" level=info msg="Container to stop \"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 25 02:01:36.929766 containerd[1523]: time="2025-03-25T02:01:36.929676601Z" level=info msg="received exit event sandbox_id:\"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\" exit_status:137 exited_at:{seconds:1742868096 nanos:556917893}"
Mar 25 02:01:36.943679 systemd[1]: cri-containerd-f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd.scope: Deactivated successfully.
Mar 25 02:01:36.950994 containerd[1523]: time="2025-03-25T02:01:36.950747978Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" id:\"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" pid:3302 exit_status:137 exited_at:{seconds:1742868096 nanos:950272620}"
Mar 25 02:01:37.031311 containerd[1523]: time="2025-03-25T02:01:37.030334183Z" level=info msg="received exit event sandbox_id:\"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" exit_status:137 exited_at:{seconds:1742868096 nanos:950272620}"
Mar 25 02:01:37.035010 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd-rootfs.mount: Deactivated successfully.
Mar 25 02:01:37.042072 containerd[1523]: time="2025-03-25T02:01:37.040451910Z" level=info msg="shim disconnected" id=f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd namespace=k8s.io
Mar 25 02:01:37.042072 containerd[1523]: time="2025-03-25T02:01:37.040634662Z" level=warning msg="cleaning up after shim disconnected" id=f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd namespace=k8s.io
Mar 25 02:01:37.042072 containerd[1523]: time="2025-03-25T02:01:37.040651743Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 25 02:01:37.050653 containerd[1523]: time="2025-03-25T02:01:37.050593942Z" level=info msg="TearDown network for sandbox \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" successfully"
Mar 25 02:01:37.050836 containerd[1523]: time="2025-03-25T02:01:37.050679689Z" level=info msg="StopPodSandbox for \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" returns successfully"
Mar 25 02:01:37.177291 kubelet[2782]: I0325 02:01:37.176570 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgvg5\" (UniqueName: \"kubernetes.io/projected/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-kube-api-access-pgvg5\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.177291 kubelet[2782]: I0325 02:01:37.176725 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-tigera-ca-bundle\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.177291 kubelet[2782]: I0325 02:01:37.176764 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-xtables-lock\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.177291 kubelet[2782]: I0325 02:01:37.176819 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-node-certs\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.177291 kubelet[2782]: I0325 02:01:37.176863 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-lib-modules\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.177291 kubelet[2782]: I0325 02:01:37.176895 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-log-dir\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.179055 kubelet[2782]: I0325 02:01:37.176925 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-bin-dir\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.179055 kubelet[2782]: I0325 02:01:37.176951 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-net-dir\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.179055 kubelet[2782]: I0325 02:01:37.176979 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-flexvol-driver-host\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.179055 kubelet[2782]: I0325 02:01:37.177033 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-var-run-calico\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.179055 kubelet[2782]: I0325 02:01:37.177066 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-var-lib-calico\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.179055 kubelet[2782]: I0325 02:01:37.177093 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-policysync\") pod \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\" (UID: \"aea98d1b-d4bf-4e73-9205-d458c40ca1c2\") "
Mar 25 02:01:37.179055 kubelet[2782]: I0325 02:01:37.178590 2782 memory_manager.go:355] "RemoveStaleState removing state" podUID="aea98d1b-d4bf-4e73-9205-d458c40ca1c2" containerName="calico-node"
Mar 25 02:01:37.190559 kubelet[2782]: I0325 02:01:37.181657 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-policysync" (OuterVolumeSpecName: "policysync") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 25 02:01:37.199927 kubelet[2782]: I0325 02:01:37.193269 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 25 02:01:37.214930 kubelet[2782]: I0325 02:01:37.214835 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 25 02:01:37.215145 kubelet[2782]: I0325 02:01:37.214962 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 25 02:01:37.215145 kubelet[2782]: I0325 02:01:37.215015 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 25 02:01:37.215145 kubelet[2782]: I0325 02:01:37.215063 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 25 02:01:37.215145 kubelet[2782]: I0325 02:01:37.215123 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 25 02:01:37.227127 systemd-networkd[1448]: cali99a7b52eb6a: Link DOWN
Mar 25 02:01:37.227140 systemd-networkd[1448]: cali99a7b52eb6a: Lost carrier
Mar 25 02:01:37.242703 kubelet[2782]: I0325 02:01:37.231367 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 25 02:01:37.249569 kubelet[2782]: I0325 02:01:37.248821 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGIDValue ""
Mar 25 02:01:37.279348 kubelet[2782]: I0325 02:01:37.278307 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/08f4aaef-0ffd-4aa0-b727-e60403b631fb-cni-bin-dir\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.279348 kubelet[2782]: I0325 02:01:37.278394 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/08f4aaef-0ffd-4aa0-b727-e60403b631fb-var-run-calico\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.279348 kubelet[2782]: I0325 02:01:37.278435 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/08f4aaef-0ffd-4aa0-b727-e60403b631fb-cni-net-dir\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.279348 kubelet[2782]: I0325 02:01:37.278481 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/08f4aaef-0ffd-4aa0-b727-e60403b631fb-node-certs\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.279348 kubelet[2782]: I0325 02:01:37.278569 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcg9r\" (UniqueName: \"kubernetes.io/projected/08f4aaef-0ffd-4aa0-b727-e60403b631fb-kube-api-access-mcg9r\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.279839 kubelet[2782]: I0325 02:01:37.278624 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08f4aaef-0ffd-4aa0-b727-e60403b631fb-tigera-ca-bundle\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.279839 kubelet[2782]: I0325 02:01:37.278696 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/08f4aaef-0ffd-4aa0-b727-e60403b631fb-policysync\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.279839 kubelet[2782]: I0325 02:01:37.278729 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08f4aaef-0ffd-4aa0-b727-e60403b631fb-lib-modules\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.279839 kubelet[2782]: I0325 02:01:37.278819 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/08f4aaef-0ffd-4aa0-b727-e60403b631fb-flexvol-driver-host\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.279839 kubelet[2782]: I0325 02:01:37.278877 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/08f4aaef-0ffd-4aa0-b727-e60403b631fb-var-lib-calico\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.281208 kubelet[2782]: I0325 02:01:37.278932 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/08f4aaef-0ffd-4aa0-b727-e60403b631fb-cni-log-dir\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.281208 kubelet[2782]: I0325 02:01:37.278968 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/08f4aaef-0ffd-4aa0-b727-e60403b631fb-xtables-lock\") pod \"calico-node-v7ngd\" (UID: \"08f4aaef-0ffd-4aa0-b727-e60403b631fb\") " pod="calico-system/calico-node-v7ngd"
Mar 25 02:01:37.281208 kubelet[2782]: I0325 02:01:37.279027 2782 reconciler_common.go:299] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-xtables-lock\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.281208 kubelet[2782]: I0325 02:01:37.279053 2782 reconciler_common.go:299] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-lib-modules\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.281208 kubelet[2782]: I0325 02:01:37.279074 2782 reconciler_common.go:299] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-bin-dir\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.281208 kubelet[2782]: I0325 02:01:37.279091 2782 reconciler_common.go:299] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-net-dir\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.281208 kubelet[2782]: I0325 02:01:37.279107 2782 reconciler_common.go:299] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-flexvol-driver-host\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.281720 kubelet[2782]: I0325 02:01:37.279130 2782 reconciler_common.go:299] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-cni-log-dir\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.281720 kubelet[2782]: I0325 02:01:37.279145 2782 reconciler_common.go:299] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-var-lib-calico\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.281720 kubelet[2782]: I0325 02:01:37.279159 2782 reconciler_common.go:299] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-policysync\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.281720 kubelet[2782]: I0325 02:01:37.279182 2782 reconciler_common.go:299] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-var-run-calico\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.286622 kubelet[2782]: I0325 02:01:37.286581 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 25 02:01:37.294310 systemd[1]: Created slice kubepods-besteffort-pod08f4aaef_0ffd_4aa0_b727_e60403b631fb.slice - libcontainer container kubepods-besteffort-pod08f4aaef_0ffd_4aa0_b727_e60403b631fb.slice.
Mar 25 02:01:37.304733 kubelet[2782]: I0325 02:01:37.304658 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-node-certs" (OuterVolumeSpecName: "node-certs") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 25 02:01:37.305992 kubelet[2782]: I0325 02:01:37.305929 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-kube-api-access-pgvg5" (OuterVolumeSpecName: "kube-api-access-pgvg5") pod "aea98d1b-d4bf-4e73-9205-d458c40ca1c2" (UID: "aea98d1b-d4bf-4e73-9205-d458c40ca1c2"). InnerVolumeSpecName "kube-api-access-pgvg5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 25 02:01:37.316258 systemd[1]: var-lib-kubelet-pods-aea98d1b\x2dd4bf\x2d4e73\x2d9205\x2dd458c40ca1c2-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully.
Mar 25 02:01:37.316464 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd-shm.mount: Deactivated successfully.
Mar 25 02:01:37.316610 systemd[1]: var-lib-kubelet-pods-aea98d1b\x2dd4bf\x2d4e73\x2d9205\x2dd458c40ca1c2-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dpgvg5.mount: Deactivated successfully.
Mar 25 02:01:37.316730 systemd[1]: var-lib-kubelet-pods-aea98d1b\x2dd4bf\x2d4e73\x2d9205\x2dd458c40ca1c2-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully.
Mar 25 02:01:37.383879 kubelet[2782]: I0325 02:01:37.383299 2782 reconciler_common.go:299] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-tigera-ca-bundle\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.383879 kubelet[2782]: I0325 02:01:37.383371 2782 reconciler_common.go:299] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-node-certs\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.383879 kubelet[2782]: I0325 02:01:37.383394 2782 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgvg5\" (UniqueName: \"kubernetes.io/projected/aea98d1b-d4bf-4e73-9205-d458c40ca1c2-kube-api-access-pgvg5\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.213 [INFO][5073] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.218 [INFO][5073] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" iface="eth0" netns="/var/run/netns/cni-f0b1b38a-8ce1-ba67-15dd-66810893e5ac"
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.223 [INFO][5073] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" iface="eth0" netns="/var/run/netns/cni-f0b1b38a-8ce1-ba67-15dd-66810893e5ac"
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.242 [INFO][5073] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" after=23.32199ms iface="eth0" netns="/var/run/netns/cni-f0b1b38a-8ce1-ba67-15dd-66810893e5ac"
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.242 [INFO][5073] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.242 [INFO][5073] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.370 [INFO][5107] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0"
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.372 [INFO][5107] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.372 [INFO][5107] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.453 [INFO][5107] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0"
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.453 [INFO][5107] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0"
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.456 [INFO][5107] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 25 02:01:37.462335 containerd[1523]: 2025-03-25 02:01:37.459 [INFO][5073] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:01:37.465626 containerd[1523]: time="2025-03-25T02:01:37.465196309Z" level=info msg="TearDown network for sandbox \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\" successfully"
Mar 25 02:01:37.465626 containerd[1523]: time="2025-03-25T02:01:37.465290275Z" level=info msg="StopPodSandbox for \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\" returns successfully"
Mar 25 02:01:37.470693 systemd[1]: run-netns-cni\x2df0b1b38a\x2d8ce1\x2dba67\x2d15dd\x2d66810893e5ac.mount: Deactivated successfully.
Mar 25 02:01:37.585965 kubelet[2782]: I0325 02:01:37.585155 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmcsq\" (UniqueName: \"kubernetes.io/projected/760bc0b6-71e1-4be3-8d8e-ceb103060e98-kube-api-access-qmcsq\") pod \"760bc0b6-71e1-4be3-8d8e-ceb103060e98\" (UID: \"760bc0b6-71e1-4be3-8d8e-ceb103060e98\") "
Mar 25 02:01:37.586843 kubelet[2782]: I0325 02:01:37.586227 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/760bc0b6-71e1-4be3-8d8e-ceb103060e98-tigera-ca-bundle\") pod \"760bc0b6-71e1-4be3-8d8e-ceb103060e98\" (UID: \"760bc0b6-71e1-4be3-8d8e-ceb103060e98\") "
Mar 25 02:01:37.596347 systemd[1]: var-lib-kubelet-pods-760bc0b6\x2d71e1\x2d4be3\x2d8d8e\x2dceb103060e98-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully.
Mar 25 02:01:37.596988 kubelet[2782]: I0325 02:01:37.596239 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/760bc0b6-71e1-4be3-8d8e-ceb103060e98-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "760bc0b6-71e1-4be3-8d8e-ceb103060e98" (UID: "760bc0b6-71e1-4be3-8d8e-ceb103060e98"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 25 02:01:37.601148 kubelet[2782]: I0325 02:01:37.600992 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/760bc0b6-71e1-4be3-8d8e-ceb103060e98-kube-api-access-qmcsq" (OuterVolumeSpecName: "kube-api-access-qmcsq") pod "760bc0b6-71e1-4be3-8d8e-ceb103060e98" (UID: "760bc0b6-71e1-4be3-8d8e-ceb103060e98"). InnerVolumeSpecName "kube-api-access-qmcsq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 25 02:01:37.613157 containerd[1523]: time="2025-03-25T02:01:37.612994377Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v7ngd,Uid:08f4aaef-0ffd-4aa0-b727-e60403b631fb,Namespace:calico-system,Attempt:0,}"
Mar 25 02:01:37.647516 containerd[1523]: time="2025-03-25T02:01:37.647455605Z" level=info msg="connecting to shim 400bb9e25b07d2db8a63da6fe0fd50b5c79647a1dc4d46766b0a852c1181a107" address="unix:///run/containerd/s/6b4ddcd07646daf4168aadb3caf8fa4349e7a8e47d0c29ab45d00121f4d20f6e" namespace=k8s.io protocol=ttrpc version=3
Mar 25 02:01:37.678777 systemd[1]: Started cri-containerd-400bb9e25b07d2db8a63da6fe0fd50b5c79647a1dc4d46766b0a852c1181a107.scope - libcontainer container 400bb9e25b07d2db8a63da6fe0fd50b5c79647a1dc4d46766b0a852c1181a107.
Mar 25 02:01:37.687788 kubelet[2782]: I0325 02:01:37.687737 2782 reconciler_common.go:299] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/760bc0b6-71e1-4be3-8d8e-ceb103060e98-tigera-ca-bundle\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.688126 kubelet[2782]: I0325 02:01:37.688092 2782 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qmcsq\" (UniqueName: \"kubernetes.io/projected/760bc0b6-71e1-4be3-8d8e-ceb103060e98-kube-api-access-qmcsq\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:37.698790 systemd[1]: cri-containerd-eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82.scope: Deactivated successfully.
Mar 25 02:01:37.699251 systemd[1]: cri-containerd-eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82.scope: Consumed 451ms CPU time, 36.9M memory peak, 26M read from disk.
Mar 25 02:01:37.707711 containerd[1523]: time="2025-03-25T02:01:37.707518331Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\" id:\"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\" pid:3395 exit_status:1 exited_at:{seconds:1742868097 nanos:705860431}"
Mar 25 02:01:37.708462 containerd[1523]: time="2025-03-25T02:01:37.707800788Z" level=info msg="received exit event container_id:\"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\" id:\"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\" pid:3395 exit_status:1 exited_at:{seconds:1742868097 nanos:705860431}"
Mar 25 02:01:37.742268 containerd[1523]: time="2025-03-25T02:01:37.742102641Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v7ngd,Uid:08f4aaef-0ffd-4aa0-b727-e60403b631fb,Namespace:calico-system,Attempt:0,} returns sandbox id \"400bb9e25b07d2db8a63da6fe0fd50b5c79647a1dc4d46766b0a852c1181a107\""
Mar 25 02:01:37.755369 containerd[1523]: time="2025-03-25T02:01:37.755101945Z" level=info msg="CreateContainer within sandbox \"400bb9e25b07d2db8a63da6fe0fd50b5c79647a1dc4d46766b0a852c1181a107\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Mar 25 02:01:37.798283 containerd[1523]: time="2025-03-25T02:01:37.798204828Z" level=info msg="Container f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:01:37.819475 containerd[1523]: time="2025-03-25T02:01:37.818974727Z" level=info msg="StopContainer for \"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\" returns successfully"
Mar 25 02:01:37.824039 containerd[1523]: time="2025-03-25T02:01:37.823368555Z" level=info msg="CreateContainer within sandbox \"400bb9e25b07d2db8a63da6fe0fd50b5c79647a1dc4d46766b0a852c1181a107\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee\""
Mar 25 02:01:37.826092 containerd[1523]: time="2025-03-25T02:01:37.824788566Z" level=info msg="StartContainer for \"f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee\""
Mar 25 02:01:37.830202 containerd[1523]: time="2025-03-25T02:01:37.829556499Z" level=info msg="connecting to shim f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee" address="unix:///run/containerd/s/6b4ddcd07646daf4168aadb3caf8fa4349e7a8e47d0c29ab45d00121f4d20f6e" protocol=ttrpc version=3
Mar 25 02:01:37.832267 containerd[1523]: time="2025-03-25T02:01:37.831759289Z" level=info msg="StopPodSandbox for \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\""
Mar 25 02:01:37.832267 containerd[1523]: time="2025-03-25T02:01:37.831873124Z" level=info msg="Container to stop \"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Mar 25 02:01:37.875741 systemd[1]: Started cri-containerd-f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee.scope - libcontainer container f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee.
Mar 25 02:01:37.889767 systemd[1]: cri-containerd-5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493.scope: Deactivated successfully.
Mar 25 02:01:37.893379 containerd[1523]: time="2025-03-25T02:01:37.893320160Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\" id:\"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\" pid:3266 exit_status:137 exited_at:{seconds:1742868097 nanos:892283020}"
Mar 25 02:01:37.950317 containerd[1523]: time="2025-03-25T02:01:37.950266279Z" level=info msg="shim disconnected" id=5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493 namespace=k8s.io
Mar 25 02:01:37.950875 containerd[1523]: time="2025-03-25T02:01:37.950844216Z" level=warning msg="cleaning up after shim disconnected" id=5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493 namespace=k8s.io
Mar 25 02:01:37.951070 containerd[1523]: time="2025-03-25T02:01:37.950871444Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Mar 25 02:01:37.975621 kubelet[2782]: I0325 02:01:37.975385 2782 scope.go:117] "RemoveContainer" containerID="b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27"
Mar 25 02:01:38.005621 systemd[1]: Removed slice kubepods-besteffort-podaea98d1b_d4bf_4e73_9205_d458c40ca1c2.slice - libcontainer container kubepods-besteffort-podaea98d1b_d4bf_4e73_9205_d458c40ca1c2.slice.
Mar 25 02:01:38.005879 systemd[1]: kubepods-besteffort-podaea98d1b_d4bf_4e73_9205_d458c40ca1c2.slice: Consumed 3.904s CPU time, 198.4M memory peak, 23.7M read from disk, 161M written to disk.
Mar 25 02:01:38.012554 containerd[1523]: time="2025-03-25T02:01:38.011517320Z" level=info msg="RemoveContainer for \"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\""
Mar 25 02:01:38.067849 containerd[1523]: time="2025-03-25T02:01:38.067691457Z" level=info msg="RemoveContainer for \"b681e296c2c7d8e71258e5bf85c935000084154d14057828e5dbb31b156b7a27\" returns successfully"
Mar 25 02:01:38.082672 kubelet[2782]: I0325 02:01:38.082612 2782 scope.go:117] "RemoveContainer" containerID="a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e"
Mar 25 02:01:38.097887 systemd[1]: Removed slice kubepods-besteffort-pod760bc0b6_71e1_4be3_8d8e_ceb103060e98.slice - libcontainer container kubepods-besteffort-pod760bc0b6_71e1_4be3_8d8e_ceb103060e98.slice.
Mar 25 02:01:38.108975 containerd[1523]: time="2025-03-25T02:01:38.108921152Z" level=info msg="received exit event sandbox_id:\"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\" exit_status:137 exited_at:{seconds:1742868097 nanos:892283020}"
Mar 25 02:01:38.110016 containerd[1523]: time="2025-03-25T02:01:38.109936369Z" level=info msg="RemoveContainer for \"a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e\""
Mar 25 02:01:38.112095 containerd[1523]: time="2025-03-25T02:01:38.112061016Z" level=info msg="TearDown network for sandbox \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\" successfully"
Mar 25 02:01:38.112274 containerd[1523]: time="2025-03-25T02:01:38.112096131Z" level=info msg="StopPodSandbox for \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\" returns successfully"
Mar 25 02:01:38.127639 containerd[1523]: time="2025-03-25T02:01:38.126689344Z" level=info msg="RemoveContainer for \"a11a3f54f322f7a59f7b7c7ffcfbd3f990ee4b81468f3e3044ad98fb73bc031e\" returns successfully"
Mar 25 02:01:38.127639 containerd[1523]: time="2025-03-25T02:01:38.127145102Z" level=info msg="StartContainer for \"f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee\" returns successfully"
Mar 25 02:01:38.133131 kubelet[2782]: I0325 02:01:38.131640 2782 scope.go:117] "RemoveContainer" containerID="dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1"
Mar 25 02:01:38.139568 containerd[1523]: time="2025-03-25T02:01:38.138624777Z" level=info msg="RemoveContainer for \"dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1\""
Mar 25 02:01:38.150464 containerd[1523]: time="2025-03-25T02:01:38.149826110Z" level=info msg="RemoveContainer for \"dff683985887377d70139100e828c07a95c3cbedf9ace74761f1c2b5f9bf9fe1\" returns successfully"
Mar 25 02:01:38.151561 kubelet[2782]: I0325 02:01:38.151497 2782 scope.go:117] "RemoveContainer" containerID="e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0"
Mar 25 02:01:38.155151 containerd[1523]: time="2025-03-25T02:01:38.155109571Z" level=info msg="RemoveContainer for \"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\""
Mar 25 02:01:38.167595 containerd[1523]: time="2025-03-25T02:01:38.167386325Z" level=info msg="RemoveContainer for \"e603885d836e236797185a6ec501928bc706d234b4b84102637fe1f3a0f6adf0\" returns successfully"
Mar 25 02:01:38.193895 kubelet[2782]: I0325 02:01:38.193244 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qxc5\" (UniqueName: \"kubernetes.io/projected/d8604123-c470-4c9c-bc8f-3853a64c4faf-kube-api-access-8qxc5\") pod \"d8604123-c470-4c9c-bc8f-3853a64c4faf\" (UID: \"d8604123-c470-4c9c-bc8f-3853a64c4faf\") "
Mar 25 02:01:38.193895 kubelet[2782]: I0325 02:01:38.193316 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d8604123-c470-4c9c-bc8f-3853a64c4faf-typha-certs\") pod \"d8604123-c470-4c9c-bc8f-3853a64c4faf\" (UID: \"d8604123-c470-4c9c-bc8f-3853a64c4faf\") "
Mar 25 02:01:38.193895 kubelet[2782]: I0325 02:01:38.193377 2782 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8604123-c470-4c9c-bc8f-3853a64c4faf-tigera-ca-bundle\") pod \"d8604123-c470-4c9c-bc8f-3853a64c4faf\" (UID: \"d8604123-c470-4c9c-bc8f-3853a64c4faf\") "
Mar 25 02:01:38.203188 kubelet[2782]: I0325 02:01:38.203107 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8604123-c470-4c9c-bc8f-3853a64c4faf-kube-api-access-8qxc5" (OuterVolumeSpecName: "kube-api-access-8qxc5") pod "d8604123-c470-4c9c-bc8f-3853a64c4faf" (UID: "d8604123-c470-4c9c-bc8f-3853a64c4faf"). InnerVolumeSpecName "kube-api-access-8qxc5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 25 02:01:38.206688 kubelet[2782]: I0325 02:01:38.206636 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8604123-c470-4c9c-bc8f-3853a64c4faf-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "d8604123-c470-4c9c-bc8f-3853a64c4faf" (UID: "d8604123-c470-4c9c-bc8f-3853a64c4faf"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 25 02:01:38.213850 kubelet[2782]: I0325 02:01:38.213793 2782 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8604123-c470-4c9c-bc8f-3853a64c4faf-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "d8604123-c470-4c9c-bc8f-3853a64c4faf" (UID: "d8604123-c470-4c9c-bc8f-3853a64c4faf"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 25 02:01:38.282525 systemd[1]: cri-containerd-f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee.scope: Deactivated successfully.
Mar 25 02:01:38.283031 systemd[1]: cri-containerd-f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee.scope: Consumed 73ms CPU time, 18.6M memory peak, 10.8M read from disk, 6.3M written to disk.
Mar 25 02:01:38.287584 containerd[1523]: time="2025-03-25T02:01:38.287424125Z" level=info msg="received exit event container_id:\"f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee\" id:\"f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee\" pid:5197 exited_at:{seconds:1742868098 nanos:286456252}"
Mar 25 02:01:38.287735 containerd[1523]: time="2025-03-25T02:01:38.287694804Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee\" id:\"f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee\" pid:5197 exited_at:{seconds:1742868098 nanos:286456252}"
Mar 25 02:01:38.295567 kubelet[2782]: I0325 02:01:38.294530 2782 reconciler_common.go:299] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8604123-c470-4c9c-bc8f-3853a64c4faf-tigera-ca-bundle\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:38.295567 kubelet[2782]: I0325 02:01:38.295073 2782 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8qxc5\" (UniqueName: \"kubernetes.io/projected/d8604123-c470-4c9c-bc8f-3853a64c4faf-kube-api-access-8qxc5\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:38.295567 kubelet[2782]: I0325 02:01:38.295107 2782 reconciler_common.go:299] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d8604123-c470-4c9c-bc8f-3853a64c4faf-typha-certs\") on node \"srv-u0apo.gb1.brightbox.com\" DevicePath \"\""
Mar 25 02:01:38.316391 systemd[1]: var-lib-kubelet-pods-760bc0b6\x2d71e1\x2d4be3\x2d8d8e\x2dceb103060e98-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dqmcsq.mount: Deactivated successfully.
Mar 25 02:01:38.316574 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82-rootfs.mount: Deactivated successfully.
Mar 25 02:01:38.316701 systemd[1]: var-lib-kubelet-pods-d8604123\x2dc470\x2d4c9c\x2dbc8f\x2d3853a64c4faf-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully.
Mar 25 02:01:38.316824 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493-rootfs.mount: Deactivated successfully.
Mar 25 02:01:38.316924 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493-shm.mount: Deactivated successfully.
Mar 25 02:01:38.317027 systemd[1]: var-lib-kubelet-pods-d8604123\x2dc470\x2d4c9c\x2dbc8f\x2d3853a64c4faf-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8qxc5.mount: Deactivated successfully.
Mar 25 02:01:38.317151 systemd[1]: var-lib-kubelet-pods-d8604123\x2dc470\x2d4c9c\x2dbc8f\x2d3853a64c4faf-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully.
Mar 25 02:01:38.335781 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f649d640bb432b27208af9189c524a2e0df9fba4679ab7060c9a1f71e1aeeeee-rootfs.mount: Deactivated successfully.
Mar 25 02:01:38.977014 kubelet[2782]: I0325 02:01:38.976658 2782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="760bc0b6-71e1-4be3-8d8e-ceb103060e98" path="/var/lib/kubelet/pods/760bc0b6-71e1-4be3-8d8e-ceb103060e98/volumes"
Mar 25 02:01:38.977582 kubelet[2782]: I0325 02:01:38.977555 2782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea98d1b-d4bf-4e73-9205-d458c40ca1c2" path="/var/lib/kubelet/pods/aea98d1b-d4bf-4e73-9205-d458c40ca1c2/volumes"
Mar 25 02:01:38.990181 systemd[1]: Removed slice kubepods-besteffort-podd8604123_c470_4c9c_bc8f_3853a64c4faf.slice - libcontainer container kubepods-besteffort-podd8604123_c470_4c9c_bc8f_3853a64c4faf.slice.
Mar 25 02:01:38.990359 systemd[1]: kubepods-besteffort-podd8604123_c470_4c9c_bc8f_3853a64c4faf.slice: Consumed 498ms CPU time, 37.2M memory peak, 26M read from disk.
Mar 25 02:01:39.128595 kubelet[2782]: I0325 02:01:39.126989 2782 scope.go:117] "RemoveContainer" containerID="eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82"
Mar 25 02:01:39.137854 containerd[1523]: time="2025-03-25T02:01:39.137796423Z" level=info msg="RemoveContainer for \"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\""
Mar 25 02:01:39.161918 containerd[1523]: time="2025-03-25T02:01:39.161195134Z" level=info msg="RemoveContainer for \"eff78994a9db9364864cd20dbdf298f324a0d739514e76292b08170e9f813c82\" returns successfully"
Mar 25 02:01:39.161918 containerd[1523]: time="2025-03-25T02:01:39.161669786Z" level=info msg="CreateContainer within sandbox \"400bb9e25b07d2db8a63da6fe0fd50b5c79647a1dc4d46766b0a852c1181a107\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Mar 25 02:01:39.192016 containerd[1523]: time="2025-03-25T02:01:39.190865063Z" level=info msg="Container 5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:01:39.215073 containerd[1523]: time="2025-03-25T02:01:39.214898799Z" level=info msg="CreateContainer within sandbox \"400bb9e25b07d2db8a63da6fe0fd50b5c79647a1dc4d46766b0a852c1181a107\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed\""
Mar 25 02:01:39.217806 containerd[1523]: time="2025-03-25T02:01:39.216653731Z" level=info msg="StartContainer for \"5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed\""
Mar 25 02:01:39.220809 containerd[1523]: time="2025-03-25T02:01:39.220774527Z" level=info msg="connecting to shim 5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed" address="unix:///run/containerd/s/6b4ddcd07646daf4168aadb3caf8fa4349e7a8e47d0c29ab45d00121f4d20f6e" protocol=ttrpc version=3
Mar 25 02:01:39.253768 systemd[1]: Started cri-containerd-5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed.scope - libcontainer container 5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed.
Mar 25 02:01:39.343811 containerd[1523]: time="2025-03-25T02:01:39.343755144Z" level=info msg="StartContainer for \"5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed\" returns successfully"
Mar 25 02:01:39.423018 kubelet[2782]: I0325 02:01:39.422744 2782 memory_manager.go:355] "RemoveStaleState removing state" podUID="760bc0b6-71e1-4be3-8d8e-ceb103060e98" containerName="calico-kube-controllers"
Mar 25 02:01:39.423018 kubelet[2782]: I0325 02:01:39.422798 2782 memory_manager.go:355] "RemoveStaleState removing state" podUID="d8604123-c470-4c9c-bc8f-3853a64c4faf" containerName="calico-typha"
Mar 25 02:01:39.438314 systemd[1]: Created slice kubepods-besteffort-podd7ef8e5e_2665_4807_ae24_ac98377e410c.slice - libcontainer container kubepods-besteffort-podd7ef8e5e_2665_4807_ae24_ac98377e410c.slice.
Mar 25 02:01:39.503278 kubelet[2782]: I0325 02:01:39.503016 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7ef8e5e-2665-4807-ae24-ac98377e410c-tigera-ca-bundle\") pod \"calico-typha-7cbd66c6f-jzmms\" (UID: \"d7ef8e5e-2665-4807-ae24-ac98377e410c\") " pod="calico-system/calico-typha-7cbd66c6f-jzmms"
Mar 25 02:01:39.503278 kubelet[2782]: I0325 02:01:39.503080 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d7ef8e5e-2665-4807-ae24-ac98377e410c-typha-certs\") pod \"calico-typha-7cbd66c6f-jzmms\" (UID: \"d7ef8e5e-2665-4807-ae24-ac98377e410c\") " pod="calico-system/calico-typha-7cbd66c6f-jzmms"
Mar 25 02:01:39.503278 kubelet[2782]: I0325 02:01:39.503134 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpkhw\" (UniqueName: \"kubernetes.io/projected/d7ef8e5e-2665-4807-ae24-ac98377e410c-kube-api-access-tpkhw\") pod \"calico-typha-7cbd66c6f-jzmms\" (UID: \"d7ef8e5e-2665-4807-ae24-ac98377e410c\") " pod="calico-system/calico-typha-7cbd66c6f-jzmms"
Mar 25 02:01:39.746143 containerd[1523]: time="2025-03-25T02:01:39.745736238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7cbd66c6f-jzmms,Uid:d7ef8e5e-2665-4807-ae24-ac98377e410c,Namespace:calico-system,Attempt:0,}"
Mar 25 02:01:39.791524 containerd[1523]: time="2025-03-25T02:01:39.789998078Z" level=info msg="connecting to shim 52023fa91fc0b36de299d51f7d87392ee8c0629af021c4754f4ed7751e53edf3" address="unix:///run/containerd/s/75a0d8de72207d806ee774121983fe811b2c188c1c60d642a21afec8afa2b171" namespace=k8s.io protocol=ttrpc version=3
Mar 25 02:01:39.842827 systemd[1]: Started cri-containerd-52023fa91fc0b36de299d51f7d87392ee8c0629af021c4754f4ed7751e53edf3.scope - libcontainer container 52023fa91fc0b36de299d51f7d87392ee8c0629af021c4754f4ed7751e53edf3.
Mar 25 02:01:39.948374 containerd[1523]: time="2025-03-25T02:01:39.948314586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7cbd66c6f-jzmms,Uid:d7ef8e5e-2665-4807-ae24-ac98377e410c,Namespace:calico-system,Attempt:0,} returns sandbox id \"52023fa91fc0b36de299d51f7d87392ee8c0629af021c4754f4ed7751e53edf3\""
Mar 25 02:01:39.964906 containerd[1523]: time="2025-03-25T02:01:39.964840905Z" level=info msg="CreateContainer within sandbox \"52023fa91fc0b36de299d51f7d87392ee8c0629af021c4754f4ed7751e53edf3\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 25 02:01:39.980508 containerd[1523]: time="2025-03-25T02:01:39.980234227Z" level=info msg="Container acb28c3882d8a36923e2573e91ce632ddf4643af1022b2a99c9ba992d37e4ff5: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:01:39.993115 containerd[1523]: time="2025-03-25T02:01:39.993062361Z" level=info msg="CreateContainer within sandbox \"52023fa91fc0b36de299d51f7d87392ee8c0629af021c4754f4ed7751e53edf3\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"acb28c3882d8a36923e2573e91ce632ddf4643af1022b2a99c9ba992d37e4ff5\""
Mar 25 02:01:39.995602 containerd[1523]: time="2025-03-25T02:01:39.995197459Z" level=info msg="StartContainer for \"acb28c3882d8a36923e2573e91ce632ddf4643af1022b2a99c9ba992d37e4ff5\""
Mar 25 02:01:39.999065 containerd[1523]: time="2025-03-25T02:01:39.998473884Z" level=info msg="connecting to shim acb28c3882d8a36923e2573e91ce632ddf4643af1022b2a99c9ba992d37e4ff5" address="unix:///run/containerd/s/75a0d8de72207d806ee774121983fe811b2c188c1c60d642a21afec8afa2b171" protocol=ttrpc version=3
Mar 25 02:01:40.039219 systemd[1]: Started cri-containerd-acb28c3882d8a36923e2573e91ce632ddf4643af1022b2a99c9ba992d37e4ff5.scope - libcontainer container acb28c3882d8a36923e2573e91ce632ddf4643af1022b2a99c9ba992d37e4ff5.
Mar 25 02:01:40.136652 containerd[1523]: time="2025-03-25T02:01:40.135618277Z" level=info msg="StartContainer for \"acb28c3882d8a36923e2573e91ce632ddf4643af1022b2a99c9ba992d37e4ff5\" returns successfully"
Mar 25 02:01:40.977802 kubelet[2782]: I0325 02:01:40.977642 2782 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8604123-c470-4c9c-bc8f-3853a64c4faf" path="/var/lib/kubelet/pods/d8604123-c470-4c9c-bc8f-3853a64c4faf/volumes"
Mar 25 02:01:41.213462 kubelet[2782]: I0325 02:01:41.213294 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7cbd66c6f-jzmms" podStartSLOduration=6.208681316 podStartE2EDuration="6.208681316s" podCreationTimestamp="2025-03-25 02:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:01:40.222303853 +0000 UTC m=+77.587271118" watchObservedRunningTime="2025-03-25 02:01:41.208681316 +0000 UTC m=+78.573648579"
Mar 25 02:01:41.473361 systemd[1]: cri-containerd-5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed.scope: Deactivated successfully.
Mar 25 02:01:41.474302 systemd[1]: cri-containerd-5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed.scope: Consumed 1.229s CPU time, 260.9M memory peak, 280.9M read from disk.
Mar 25 02:01:41.476974 containerd[1523]: time="2025-03-25T02:01:41.476826891Z" level=info msg="received exit event container_id:\"5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed\" id:\"5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed\" pid:5282 exited_at:{seconds:1742868101 nanos:474638653}"
Mar 25 02:01:41.477796 containerd[1523]: time="2025-03-25T02:01:41.477621743Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed\" id:\"5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed\" pid:5282 exited_at:{seconds:1742868101 nanos:474638653}"
Mar 25 02:01:41.513316 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5ba211100e32f4a3914d45b46642567f61268411bdbdafdb1d073a20e52d51ed-rootfs.mount: Deactivated successfully.
Mar 25 02:01:42.224038 containerd[1523]: time="2025-03-25T02:01:42.223973909Z" level=info msg="CreateContainer within sandbox \"400bb9e25b07d2db8a63da6fe0fd50b5c79647a1dc4d46766b0a852c1181a107\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 25 02:01:42.242249 containerd[1523]: time="2025-03-25T02:01:42.239181536Z" level=info msg="Container b490e94ad79cac22d205e54a0983d144fbb05122429f247f5df944f76bff1cdc: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:01:42.253149 containerd[1523]: time="2025-03-25T02:01:42.253094941Z" level=info msg="CreateContainer within sandbox \"400bb9e25b07d2db8a63da6fe0fd50b5c79647a1dc4d46766b0a852c1181a107\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b490e94ad79cac22d205e54a0983d144fbb05122429f247f5df944f76bff1cdc\""
Mar 25 02:01:42.254520 containerd[1523]: time="2025-03-25T02:01:42.254483152Z" level=info msg="StartContainer for \"b490e94ad79cac22d205e54a0983d144fbb05122429f247f5df944f76bff1cdc\""
Mar 25 02:01:42.256502 containerd[1523]: time="2025-03-25T02:01:42.256443562Z" level=info msg="connecting to shim b490e94ad79cac22d205e54a0983d144fbb05122429f247f5df944f76bff1cdc" address="unix:///run/containerd/s/6b4ddcd07646daf4168aadb3caf8fa4349e7a8e47d0c29ab45d00121f4d20f6e" protocol=ttrpc version=3
Mar 25 02:01:42.291017 systemd[1]: Started cri-containerd-b490e94ad79cac22d205e54a0983d144fbb05122429f247f5df944f76bff1cdc.scope - libcontainer container b490e94ad79cac22d205e54a0983d144fbb05122429f247f5df944f76bff1cdc.
Mar 25 02:01:42.373840 containerd[1523]: time="2025-03-25T02:01:42.373752211Z" level=info msg="StartContainer for \"b490e94ad79cac22d205e54a0983d144fbb05122429f247f5df944f76bff1cdc\" returns successfully"
Mar 25 02:01:42.563979 systemd[1]: Created slice kubepods-besteffort-pod4e5a6d29_b71c_47a3_a360_e7528cf68f9c.slice - libcontainer container kubepods-besteffort-pod4e5a6d29_b71c_47a3_a360_e7528cf68f9c.slice.
Mar 25 02:01:42.630052 kubelet[2782]: I0325 02:01:42.629982 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdv5n\" (UniqueName: \"kubernetes.io/projected/4e5a6d29-b71c-47a3-a360-e7528cf68f9c-kube-api-access-zdv5n\") pod \"calico-kube-controllers-778957858-llbmd\" (UID: \"4e5a6d29-b71c-47a3-a360-e7528cf68f9c\") " pod="calico-system/calico-kube-controllers-778957858-llbmd"
Mar 25 02:01:42.630052 kubelet[2782]: I0325 02:01:42.630058 2782 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e5a6d29-b71c-47a3-a360-e7528cf68f9c-tigera-ca-bundle\") pod \"calico-kube-controllers-778957858-llbmd\" (UID: \"4e5a6d29-b71c-47a3-a360-e7528cf68f9c\") " pod="calico-system/calico-kube-controllers-778957858-llbmd"
Mar 25 02:01:42.871573 containerd[1523]: time="2025-03-25T02:01:42.871506922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-778957858-llbmd,Uid:4e5a6d29-b71c-47a3-a360-e7528cf68f9c,Namespace:calico-system,Attempt:0,}"
Mar 25 02:01:43.145997 systemd-networkd[1448]: cali68e5fd6121b: Link UP
Mar 25 02:01:43.146383 systemd-networkd[1448]: cali68e5fd6121b: Gained carrier
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:42.993 [INFO][5454] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0 calico-kube-controllers-778957858- calico-system 4e5a6d29-b71c-47a3-a360-e7528cf68f9c 1054 0 2025-03-25 02:01:38 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:778957858 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-u0apo.gb1.brightbox.com calico-kube-controllers-778957858-llbmd eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali68e5fd6121b [] []}} ContainerID="0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" Namespace="calico-system" Pod="calico-kube-controllers-778957858-llbmd" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:42.993 [INFO][5454] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" Namespace="calico-system" Pod="calico-kube-controllers-778957858-llbmd" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.085 [INFO][5465] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" HandleID="k8s-pod-network.0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.097 [INFO][5465] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" HandleID="k8s-pod-network.0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ed7c0), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-u0apo.gb1.brightbox.com", "pod":"calico-kube-controllers-778957858-llbmd", "timestamp":"2025-03-25 02:01:43.085865439 +0000 UTC"}, Hostname:"srv-u0apo.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.097 [INFO][5465] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.097 [INFO][5465] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.098 [INFO][5465] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-u0apo.gb1.brightbox.com'
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.100 [INFO][5465] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" host="srv-u0apo.gb1.brightbox.com"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.106 [INFO][5465] ipam/ipam.go 372: Looking up existing affinities for host host="srv-u0apo.gb1.brightbox.com"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.112 [INFO][5465] ipam/ipam.go 489: Trying affinity for 192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.115 [INFO][5465] ipam/ipam.go 155: Attempting to load block cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.118 [INFO][5465] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="srv-u0apo.gb1.brightbox.com"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.118 [INFO][5465] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" host="srv-u0apo.gb1.brightbox.com"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.120 [INFO][5465] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.126 [INFO][5465] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" host="srv-u0apo.gb1.brightbox.com"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.133 [INFO][5465] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.116.71/26] block=192.168.116.64/26 handle="k8s-pod-network.0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" host="srv-u0apo.gb1.brightbox.com"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.133 [INFO][5465] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.116.71/26] handle="k8s-pod-network.0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" host="srv-u0apo.gb1.brightbox.com"
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.133 [INFO][5465] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 25 02:01:43.193783 containerd[1523]: 2025-03-25 02:01:43.134 [INFO][5465] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.71/26] IPv6=[] ContainerID="0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" HandleID="k8s-pod-network.0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0"
Mar 25 02:01:43.195476 containerd[1523]: 2025-03-25 02:01:43.138 [INFO][5454] cni-plugin/k8s.go 386: Populated endpoint ContainerID="0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" Namespace="calico-system" Pod="calico-kube-controllers-778957858-llbmd" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0", GenerateName:"calico-kube-controllers-778957858-", Namespace:"calico-system", SelfLink:"", UID:"4e5a6d29-b71c-47a3-a360-e7528cf68f9c", ResourceVersion:"1054", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 1, 38, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"778957858", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-778957858-llbmd", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.116.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali68e5fd6121b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 25 02:01:43.195476 containerd[1523]: 2025-03-25 02:01:43.140 [INFO][5454] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.116.71/32] ContainerID="0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" Namespace="calico-system" Pod="calico-kube-controllers-778957858-llbmd" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0"
Mar 25 02:01:43.195476 containerd[1523]: 2025-03-25 02:01:43.140 [INFO][5454] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali68e5fd6121b ContainerID="0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" Namespace="calico-system" Pod="calico-kube-controllers-778957858-llbmd" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0"
Mar 25 02:01:43.195476 containerd[1523]: 2025-03-25 02:01:43.145 [INFO][5454] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" Namespace="calico-system" Pod="calico-kube-controllers-778957858-llbmd" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0"
Mar 25 02:01:43.195476 containerd[1523]: 2025-03-25 02:01:43.145 [INFO][5454] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" Namespace="calico-system" Pod="calico-kube-controllers-778957858-llbmd" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0", GenerateName:"calico-kube-controllers-778957858-", Namespace:"calico-system", SelfLink:"", UID:"4e5a6d29-b71c-47a3-a360-e7528cf68f9c", ResourceVersion:"1054", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 2, 1, 38, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"778957858", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-u0apo.gb1.brightbox.com", ContainerID:"0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4", Pod:"calico-kube-controllers-778957858-llbmd", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.116.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali68e5fd6121b", MAC:"62:df:2d:1b:cc:4d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 25 02:01:43.195476 containerd[1523]: 2025-03-25 02:01:43.186 [INFO][5454] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" Namespace="calico-system" Pod="calico-kube-controllers-778957858-llbmd" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--778957858--llbmd-eth0"
Mar 25 02:01:43.288578 kubelet[2782]: I0325 02:01:43.286792 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-v7ngd" podStartSLOduration=6.286768371 podStartE2EDuration="6.286768371s" podCreationTimestamp="2025-03-25 02:01:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:01:43.281311183 +0000 UTC m=+80.646278444" watchObservedRunningTime="2025-03-25 02:01:43.286768371 +0000 UTC m=+80.651735615"
Mar 25 02:01:43.303241 containerd[1523]: time="2025-03-25T02:01:43.303159122Z" level=info msg="connecting to shim 0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4" address="unix:///run/containerd/s/145c7998fa9cab07dccd1d5ac89b609933c83756464bf97ef799bfe90b28d3f9" namespace=k8s.io protocol=ttrpc version=3
Mar 25 02:01:43.358510 systemd[1]: Started cri-containerd-0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4.scope - libcontainer container 0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4.
Mar 25 02:01:43.421594 containerd[1523]: time="2025-03-25T02:01:43.421207891Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b490e94ad79cac22d205e54a0983d144fbb05122429f247f5df944f76bff1cdc\" id:\"1d893af59398adaf5f569bfb65ee3c15f4a4fd1932ce69d5d1d1471d8c3f218c\" pid:5502 exit_status:1 exited_at:{seconds:1742868103 nanos:420606448}"
Mar 25 02:01:43.468938 containerd[1523]: time="2025-03-25T02:01:43.468831954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-778957858-llbmd,Uid:4e5a6d29-b71c-47a3-a360-e7528cf68f9c,Namespace:calico-system,Attempt:0,} returns sandbox id \"0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4\""
Mar 25 02:01:43.496227 containerd[1523]: time="2025-03-25T02:01:43.496168957Z" level=info msg="CreateContainer within sandbox \"0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 25 02:01:43.503799 containerd[1523]: time="2025-03-25T02:01:43.503745728Z" level=info msg="Container acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f: CDI devices from CRI Config.CDIDevices: []"
Mar 25 02:01:43.511697 containerd[1523]: time="2025-03-25T02:01:43.511650744Z" level=info msg="CreateContainer within sandbox \"0c97c5d9f48fa3e8d788baf94951b36d0c2cd77fedb52ef13a02def0055bd0b4\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f\""
Mar 25 02:01:43.531290 containerd[1523]: time="2025-03-25T02:01:43.531193231Z" level=info msg="StartContainer for \"acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f\""
Mar 25 02:01:43.534014 containerd[1523]: time="2025-03-25T02:01:43.533444304Z" level=info msg="connecting to shim acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f" address="unix:///run/containerd/s/145c7998fa9cab07dccd1d5ac89b609933c83756464bf97ef799bfe90b28d3f9" protocol=ttrpc version=3
Mar 25 02:01:43.561765 systemd[1]: Started cri-containerd-acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f.scope - libcontainer container acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f.
Mar 25 02:01:43.595658 systemd[1]: Started sshd@9-10.230.42.214:22-139.178.68.195:44748.service - OpenSSH per-connection server daemon (139.178.68.195:44748).
Mar 25 02:01:43.708332 containerd[1523]: time="2025-03-25T02:01:43.707706303Z" level=info msg="StartContainer for \"acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f\" returns successfully"
Mar 25 02:01:44.629596 sshd[5575]: Accepted publickey for core from 139.178.68.195 port 44748 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:01:44.637976 sshd-session[5575]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:01:44.685379 systemd-logind[1502]: New session 12 of user core.
Mar 25 02:01:44.695022 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 25 02:01:44.747448 systemd-networkd[1448]: cali68e5fd6121b: Gained IPv6LL
Mar 25 02:01:45.183568 containerd[1523]: time="2025-03-25T02:01:45.183397027Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b490e94ad79cac22d205e54a0983d144fbb05122429f247f5df944f76bff1cdc\" id:\"9b5d19161dc6dc559c8427f5b857a19a7a369aff2eda0e90743b42ce65a8a4ae\" pid:5694 exit_status:1 exited_at:{seconds:1742868105 nanos:182108878}"
Mar 25 02:01:45.704091 containerd[1523]: time="2025-03-25T02:01:45.703747629Z" level=info msg="TaskExit event in podsandbox handler container_id:\"acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f\" id:\"c02b458e4c8d231803a4a7cc660364cec17ca993cdda25b55f72b8950c9710ba\" pid:5765 exit_status:1 exited_at:{seconds:1742868105 nanos:702563811}"
Mar 25 02:01:46.239474 sshd[5690]: Connection closed by 139.178.68.195 port 44748
Mar 25 02:01:46.241636 sshd-session[5575]: pam_unix(sshd:session): session closed for user core
Mar 25 02:01:46.254262 systemd[1]: sshd@9-10.230.42.214:22-139.178.68.195:44748.service: Deactivated successfully.
Mar 25 02:01:46.260065 systemd[1]: session-12.scope: Deactivated successfully.
Mar 25 02:01:46.264344 systemd-logind[1502]: Session 12 logged out. Waiting for processes to exit.
Mar 25 02:01:46.266466 systemd-logind[1502]: Removed session 12.
Mar 25 02:01:46.610398 containerd[1523]: time="2025-03-25T02:01:46.610116928Z" level=info msg="TaskExit event in podsandbox handler container_id:\"acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f\" id:\"c8f11bcb918ac9861fbec1ca09a8b59aaf42c7c1c4a395b815c12a78df72593d\" pid:5860 exit_status:1 exited_at:{seconds:1742868106 nanos:595957423}"
Mar 25 02:01:48.761326 kubelet[2782]: I0325 02:01:48.760959 2782 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 02:01:48.896075 kubelet[2782]: I0325 02:01:48.875482 2782 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-778957858-llbmd" podStartSLOduration=10.875412123 podStartE2EDuration="10.875412123s" podCreationTimestamp="2025-03-25 02:01:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 02:01:44.874752733 +0000 UTC m=+82.239720002" watchObservedRunningTime="2025-03-25 02:01:48.875412123 +0000 UTC m=+86.240379367"
Mar 25 02:01:50.431462 update_engine[1504]: I20250325 02:01:50.431203 1504 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Mar 25 02:01:50.431462 update_engine[1504]: I20250325 02:01:50.431360 1504 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Mar 25 02:01:50.434162 update_engine[1504]: I20250325 02:01:50.433912 1504 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Mar 25 02:01:50.435750 update_engine[1504]: I20250325 02:01:50.435688 1504 omaha_request_params.cc:62] Current group set to alpha
Mar 25 02:01:50.436816 update_engine[1504]: I20250325 02:01:50.436055 1504 update_attempter.cc:499] Already updated boot flags. Skipping.
Mar 25 02:01:50.436816 update_engine[1504]: I20250325 02:01:50.436082 1504 update_attempter.cc:643] Scheduling an action processor start.
Mar 25 02:01:50.436816 update_engine[1504]: I20250325 02:01:50.436116 1504 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 25 02:01:50.436816 update_engine[1504]: I20250325 02:01:50.436192 1504 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Mar 25 02:01:50.436816 update_engine[1504]: I20250325 02:01:50.436308 1504 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 25 02:01:50.436816 update_engine[1504]: I20250325 02:01:50.436328 1504 omaha_request_action.cc:272] Request:
Mar 25 02:01:50.436816 update_engine[1504]:
Mar 25 02:01:50.436816 update_engine[1504]:
Mar 25 02:01:50.436816 update_engine[1504]:
Mar 25 02:01:50.436816 update_engine[1504]:
Mar 25 02:01:50.436816 update_engine[1504]:
Mar 25 02:01:50.436816 update_engine[1504]:
Mar 25 02:01:50.436816 update_engine[1504]:
Mar 25 02:01:50.436816 update_engine[1504]:
Mar 25 02:01:50.436816 update_engine[1504]: I20250325 02:01:50.436342 1504 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 02:01:50.473994 update_engine[1504]: I20250325 02:01:50.473368 1504 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 02:01:50.473994 update_engine[1504]: I20250325 02:01:50.473878 1504 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 02:01:50.476099 locksmithd[1531]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Mar 25 02:01:50.481081 update_engine[1504]: E20250325 02:01:50.480934 1504 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 02:01:50.481081 update_engine[1504]: I20250325 02:01:50.481038 1504 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Mar 25 02:01:51.399311 systemd[1]: Started sshd@10-10.230.42.214:22-139.178.68.195:40618.service - OpenSSH per-connection server daemon (139.178.68.195:40618).
Mar 25 02:01:52.387449 sshd[5882]: Accepted publickey for core from 139.178.68.195 port 40618 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:01:52.394664 sshd-session[5882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:01:52.403361 systemd-logind[1502]: New session 13 of user core.
Mar 25 02:01:52.410779 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 25 02:01:53.375883 sshd[5886]: Connection closed by 139.178.68.195 port 40618
Mar 25 02:01:53.382022 sshd-session[5882]: pam_unix(sshd:session): session closed for user core
Mar 25 02:01:53.393892 systemd-logind[1502]: Session 13 logged out. Waiting for processes to exit.
Mar 25 02:01:53.394672 systemd[1]: sshd@10-10.230.42.214:22-139.178.68.195:40618.service: Deactivated successfully.
Mar 25 02:01:53.398582 systemd[1]: session-13.scope: Deactivated successfully.
Mar 25 02:01:53.400325 systemd-logind[1502]: Removed session 13.
Mar 25 02:01:58.535026 systemd[1]: Started sshd@11-10.230.42.214:22-139.178.68.195:60506.service - OpenSSH per-connection server daemon (139.178.68.195:60506).
Mar 25 02:01:59.478056 sshd[5909]: Accepted publickey for core from 139.178.68.195 port 60506 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:01:59.481058 sshd-session[5909]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:01:59.489610 systemd-logind[1502]: New session 14 of user core.
Mar 25 02:01:59.499868 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 25 02:02:00.230427 sshd[5911]: Connection closed by 139.178.68.195 port 60506
Mar 25 02:02:00.231590 sshd-session[5909]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:00.237646 systemd[1]: sshd@11-10.230.42.214:22-139.178.68.195:60506.service: Deactivated successfully.
Mar 25 02:02:00.241799 systemd[1]: session-14.scope: Deactivated successfully.
Mar 25 02:02:00.243901 systemd-logind[1502]: Session 14 logged out. Waiting for processes to exit.
Mar 25 02:02:00.245413 systemd-logind[1502]: Removed session 14.
Mar 25 02:02:00.367053 update_engine[1504]: I20250325 02:02:00.366811 1504 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 02:02:00.367826 update_engine[1504]: I20250325 02:02:00.367686 1504 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 02:02:00.368370 update_engine[1504]: I20250325 02:02:00.368325 1504 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 02:02:00.374398 update_engine[1504]: E20250325 02:02:00.368770 1504 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 02:02:00.374398 update_engine[1504]: I20250325 02:02:00.368867 1504 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Mar 25 02:02:00.389451 systemd[1]: Started sshd@12-10.230.42.214:22-139.178.68.195:60512.service - OpenSSH per-connection server daemon (139.178.68.195:60512).
Mar 25 02:02:01.326757 sshd[5924]: Accepted publickey for core from 139.178.68.195 port 60512 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:01.328777 sshd-session[5924]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:01.336766 systemd-logind[1502]: New session 15 of user core.
Mar 25 02:02:01.340805 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 25 02:02:02.138905 sshd[5926]: Connection closed by 139.178.68.195 port 60512
Mar 25 02:02:02.139476 sshd-session[5924]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:02.145925 systemd-logind[1502]: Session 15 logged out. Waiting for processes to exit.
Mar 25 02:02:02.146435 systemd[1]: sshd@12-10.230.42.214:22-139.178.68.195:60512.service: Deactivated successfully.
Mar 25 02:02:02.149025 systemd[1]: session-15.scope: Deactivated successfully.
Mar 25 02:02:02.150525 systemd-logind[1502]: Removed session 15.
Mar 25 02:02:02.297507 systemd[1]: Started sshd@13-10.230.42.214:22-139.178.68.195:60528.service - OpenSSH per-connection server daemon (139.178.68.195:60528).
Mar 25 02:02:03.227433 sshd[5935]: Accepted publickey for core from 139.178.68.195 port 60528 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:03.229535 sshd-session[5935]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:03.239028 systemd-logind[1502]: New session 16 of user core.
Mar 25 02:02:03.246802 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 25 02:02:03.964006 sshd[5937]: Connection closed by 139.178.68.195 port 60528
Mar 25 02:02:03.964495 sshd-session[5935]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:03.969779 systemd-logind[1502]: Session 16 logged out. Waiting for processes to exit.
Mar 25 02:02:03.970831 systemd[1]: sshd@13-10.230.42.214:22-139.178.68.195:60528.service: Deactivated successfully.
Mar 25 02:02:03.974279 systemd[1]: session-16.scope: Deactivated successfully.
Mar 25 02:02:03.976074 systemd-logind[1502]: Removed session 16.
Mar 25 02:02:09.129123 systemd[1]: Started sshd@14-10.230.42.214:22-139.178.68.195:48840.service - OpenSSH per-connection server daemon (139.178.68.195:48840).
Mar 25 02:02:10.052356 sshd[5963]: Accepted publickey for core from 139.178.68.195 port 48840 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:10.056597 sshd-session[5963]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:10.066254 systemd-logind[1502]: New session 17 of user core.
Mar 25 02:02:10.072867 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 25 02:02:10.367584 update_engine[1504]: I20250325 02:02:10.366986 1504 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 02:02:10.368344 update_engine[1504]: I20250325 02:02:10.367751 1504 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 02:02:10.368501 update_engine[1504]: I20250325 02:02:10.368371 1504 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 02:02:10.368901 update_engine[1504]: E20250325 02:02:10.368852 1504 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 02:02:10.368983 update_engine[1504]: I20250325 02:02:10.368956 1504 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Mar 25 02:02:10.842516 sshd[5966]: Connection closed by 139.178.68.195 port 48840
Mar 25 02:02:10.846740 sshd-session[5963]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:10.857704 systemd-logind[1502]: Session 17 logged out. Waiting for processes to exit.
Mar 25 02:02:10.858359 systemd[1]: sshd@14-10.230.42.214:22-139.178.68.195:48840.service: Deactivated successfully.
Mar 25 02:02:10.862594 systemd[1]: session-17.scope: Deactivated successfully.
Mar 25 02:02:10.864462 systemd-logind[1502]: Removed session 17.
Mar 25 02:02:14.618461 containerd[1523]: time="2025-03-25T02:02:14.609286050Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b490e94ad79cac22d205e54a0983d144fbb05122429f247f5df944f76bff1cdc\" id:\"46a4047ecd982d43262842fec92cc9148e959b91d36d7b5ed8c84ede526a329e\" pid:5989 exited_at:{seconds:1742868134 nanos:608231693}"
Mar 25 02:02:16.002291 systemd[1]: Started sshd@15-10.230.42.214:22-139.178.68.195:48664.service - OpenSSH per-connection server daemon (139.178.68.195:48664).
Mar 25 02:02:16.587224 containerd[1523]: time="2025-03-25T02:02:16.587037364Z" level=info msg="TaskExit event in podsandbox handler container_id:\"acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f\" id:\"a9e033febd78c511e97e3492e091f4094aa7f20087d642dd8706e3f63277037d\" pid:6015 exited_at:{seconds:1742868136 nanos:586410828}"
Mar 25 02:02:16.970401 sshd[6001]: Accepted publickey for core from 139.178.68.195 port 48664 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:16.974741 sshd-session[6001]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:16.983613 systemd-logind[1502]: New session 18 of user core.
Mar 25 02:02:16.990814 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 25 02:02:17.740746 sshd[6024]: Connection closed by 139.178.68.195 port 48664
Mar 25 02:02:17.741848 sshd-session[6001]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:17.749004 systemd[1]: sshd@15-10.230.42.214:22-139.178.68.195:48664.service: Deactivated successfully.
Mar 25 02:02:17.751732 systemd[1]: session-18.scope: Deactivated successfully.
Mar 25 02:02:17.752808 systemd-logind[1502]: Session 18 logged out. Waiting for processes to exit.
Mar 25 02:02:17.754731 systemd-logind[1502]: Removed session 18.
Mar 25 02:02:20.368701 update_engine[1504]: I20250325 02:02:20.367786 1504 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 02:02:20.368701 update_engine[1504]: I20250325 02:02:20.368417 1504 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 02:02:20.369785 update_engine[1504]: I20250325 02:02:20.369017 1504 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 02:02:20.369785 update_engine[1504]: E20250325 02:02:20.369695 1504 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 02:02:20.369963 update_engine[1504]: I20250325 02:02:20.369798 1504 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 25 02:02:20.369963 update_engine[1504]: I20250325 02:02:20.369825 1504 omaha_request_action.cc:617] Omaha request response:
Mar 25 02:02:20.370063 update_engine[1504]: E20250325 02:02:20.370012 1504 omaha_request_action.cc:636] Omaha request network transfer failed.
Mar 25 02:02:20.370267 update_engine[1504]: I20250325 02:02:20.370213 1504 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Mar 25 02:02:20.370267 update_engine[1504]: I20250325 02:02:20.370239 1504 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 25 02:02:20.370267 update_engine[1504]: I20250325 02:02:20.370260 1504 update_attempter.cc:306] Processing Done.
Mar 25 02:02:20.370493 update_engine[1504]: E20250325 02:02:20.370325 1504 update_attempter.cc:619] Update failed.
Mar 25 02:02:20.370493 update_engine[1504]: I20250325 02:02:20.370349 1504 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Mar 25 02:02:20.370493 update_engine[1504]: I20250325 02:02:20.370362 1504 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Mar 25 02:02:20.370493 update_engine[1504]: I20250325 02:02:20.370374 1504 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Mar 25 02:02:20.370757 update_engine[1504]: I20250325 02:02:20.370579 1504 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 25 02:02:20.370757 update_engine[1504]: I20250325 02:02:20.370658 1504 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 25 02:02:20.370757 update_engine[1504]: I20250325 02:02:20.370676 1504 omaha_request_action.cc:272] Request:
Mar 25 02:02:20.370757 update_engine[1504]:
Mar 25 02:02:20.370757 update_engine[1504]:
Mar 25 02:02:20.370757 update_engine[1504]:
Mar 25 02:02:20.370757 update_engine[1504]:
Mar 25 02:02:20.370757 update_engine[1504]:
Mar 25 02:02:20.370757 update_engine[1504]:
Mar 25 02:02:20.370757 update_engine[1504]: I20250325 02:02:20.370689 1504 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 02:02:20.371210 update_engine[1504]: I20250325 02:02:20.370893 1504 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 02:02:20.371210 update_engine[1504]: I20250325 02:02:20.371158 1504 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 02:02:20.372657 update_engine[1504]: E20250325 02:02:20.371737 1504 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 02:02:20.372657 update_engine[1504]: I20250325 02:02:20.371812 1504 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 25 02:02:20.372657 update_engine[1504]: I20250325 02:02:20.371829 1504 omaha_request_action.cc:617] Omaha request response:
Mar 25 02:02:20.372657 update_engine[1504]: I20250325 02:02:20.371841 1504 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 25 02:02:20.372657 update_engine[1504]: I20250325 02:02:20.371896 1504 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 25 02:02:20.372657 update_engine[1504]: I20250325 02:02:20.371905 1504 update_attempter.cc:306] Processing Done.
Mar 25 02:02:20.372657 update_engine[1504]: I20250325 02:02:20.371917 1504 update_attempter.cc:310] Error event sent.
Mar 25 02:02:20.373038 locksmithd[1531]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Mar 25 02:02:20.374117 update_engine[1504]: I20250325 02:02:20.373939 1504 update_check_scheduler.cc:74] Next update check in 45m15s
Mar 25 02:02:20.374467 locksmithd[1531]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Mar 25 02:02:22.906440 systemd[1]: Started sshd@16-10.230.42.214:22-139.178.68.195:48674.service - OpenSSH per-connection server daemon (139.178.68.195:48674).
Mar 25 02:02:22.970897 containerd[1523]: time="2025-03-25T02:02:22.970728313Z" level=info msg="StopPodSandbox for \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\""
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.236 [WARNING][6053] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0"
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.238 [INFO][6053] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.238 [INFO][6053] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" iface="eth0" netns=""
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.239 [INFO][6053] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.239 [INFO][6053] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.445 [INFO][6060] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0"
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.447 [INFO][6060] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.447 [INFO][6060] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.460 [WARNING][6060] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0"
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.460 [INFO][6060] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0"
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.462 [INFO][6060] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 25 02:02:23.466915 containerd[1523]: 2025-03-25 02:02:23.464 [INFO][6053] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:02:23.477521 containerd[1523]: time="2025-03-25T02:02:23.477454523Z" level=info msg="TearDown network for sandbox \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\" successfully"
Mar 25 02:02:23.477521 containerd[1523]: time="2025-03-25T02:02:23.477523604Z" level=info msg="StopPodSandbox for \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\" returns successfully"
Mar 25 02:02:23.625567 containerd[1523]: time="2025-03-25T02:02:23.625157908Z" level=info msg="RemovePodSandbox for \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\""
Mar 25 02:02:23.625567 containerd[1523]: time="2025-03-25T02:02:23.625243754Z" level=info msg="Forcibly stopping sandbox \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\""
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.698 [WARNING][6079] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" WorkloadEndpoint="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0"
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.699 [INFO][6079] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.699 [INFO][6079] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" iface="eth0" netns=""
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.699 [INFO][6079] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.699 [INFO][6079] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.749 [INFO][6086] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0"
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.750 [INFO][6086] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.750 [INFO][6086] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.760 [WARNING][6086] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0"
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.761 [INFO][6086] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" HandleID="k8s-pod-network.0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0" Workload="srv--u0apo.gb1.brightbox.com-k8s-calico--kube--controllers--5b9d7665cb--7crdx-eth0"
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.763 [INFO][6086] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 25 02:02:23.768208 containerd[1523]: 2025-03-25 02:02:23.766 [INFO][6079] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0"
Mar 25 02:02:23.770839 containerd[1523]: time="2025-03-25T02:02:23.768687747Z" level=info msg="TearDown network for sandbox \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\" successfully"
Mar 25 02:02:23.783735 containerd[1523]: time="2025-03-25T02:02:23.783664024Z" level=info msg="Ensure that sandbox 0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0 in task-service has been cleanup successfully"
Mar 25 02:02:23.802742 containerd[1523]: time="2025-03-25T02:02:23.802681327Z" level=info msg="RemovePodSandbox \"0708d8f4d51c266358ac4e5f9e62211202de72fd8ccac88c4eba98d55dc5c9d0\" returns successfully"
Mar 25 02:02:23.816967 containerd[1523]: time="2025-03-25T02:02:23.816470092Z" level=info msg="StopPodSandbox for \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\""
Mar 25 02:02:23.816967 containerd[1523]: time="2025-03-25T02:02:23.816808370Z" level=info msg="TearDown network for sandbox \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" successfully"
Mar 25 02:02:23.816967 containerd[1523]: time="2025-03-25T02:02:23.816840666Z" level=info msg="StopPodSandbox for \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" returns successfully"
Mar 25 02:02:23.818105 containerd[1523]: time="2025-03-25T02:02:23.817867810Z" level=info msg="RemovePodSandbox for \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\""
Mar 25 02:02:23.818105 containerd[1523]: time="2025-03-25T02:02:23.817952223Z" level=info msg="Forcibly stopping sandbox \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\""
Mar 25 02:02:23.818452 containerd[1523]: time="2025-03-25T02:02:23.818308305Z" level=info msg="TearDown network for sandbox \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" successfully"
Mar 25 02:02:23.821702 containerd[1523]: time="2025-03-25T02:02:23.821511558Z" level=info msg="Ensure that sandbox f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd in task-service has been cleanup successfully"
Mar 25 02:02:23.825191 containerd[1523]: time="2025-03-25T02:02:23.825127944Z" level=info msg="RemovePodSandbox \"f9ae888c59de529157a179b67172cf9eaa3f9a8519c3becb5e219727d4d9d8fd\" returns successfully"
Mar 25 02:02:23.825601 containerd[1523]: time="2025-03-25T02:02:23.825560064Z" level=info msg="StopPodSandbox for \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\""
Mar 25 02:02:23.825775 containerd[1523]: time="2025-03-25T02:02:23.825723439Z" level=info msg="TearDown network for sandbox \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\" successfully"
Mar 25 02:02:23.825775 containerd[1523]: time="2025-03-25T02:02:23.825755670Z" level=info msg="StopPodSandbox for \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\" returns successfully"
Mar 25 02:02:23.826957 containerd[1523]: time="2025-03-25T02:02:23.826266584Z" level=info msg="RemovePodSandbox for \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\""
Mar 25 02:02:23.826957 containerd[1523]: time="2025-03-25T02:02:23.826301832Z" level=info msg="Forcibly stopping sandbox \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\""
Mar 25 02:02:23.826957 containerd[1523]: time="2025-03-25T02:02:23.826457997Z" level=info msg="TearDown network for sandbox \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\" successfully"
Mar 25 02:02:23.833672 containerd[1523]: time="2025-03-25T02:02:23.833641275Z" level=info msg="Ensure that sandbox 5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493 in task-service has been cleanup successfully"
Mar 25 02:02:23.837249 containerd[1523]: time="2025-03-25T02:02:23.837218825Z" level=info msg="RemovePodSandbox \"5eae52c6b7fd0b488d7800ca7f38ae28566487282267fbc21bd95e17f6452493\" returns successfully"
Mar 25 02:02:23.868593 sshd[6037]: Accepted publickey for core from 139.178.68.195 port 48674 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:23.872944 sshd-session[6037]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:23.881658 systemd-logind[1502]: New session 19 of user core.
Mar 25 02:02:23.888795 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 25 02:02:24.689669 sshd[6092]: Connection closed by 139.178.68.195 port 48674
Mar 25 02:02:24.690290 sshd-session[6037]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:24.698034 systemd[1]: sshd@16-10.230.42.214:22-139.178.68.195:48674.service: Deactivated successfully.
Mar 25 02:02:24.701052 systemd[1]: session-19.scope: Deactivated successfully.
Mar 25 02:02:24.702735 systemd-logind[1502]: Session 19 logged out. Waiting for processes to exit.
Mar 25 02:02:24.704193 systemd-logind[1502]: Removed session 19.
Mar 25 02:02:24.855380 systemd[1]: Started sshd@17-10.230.42.214:22-139.178.68.195:48690.service - OpenSSH per-connection server daemon (139.178.68.195:48690).
Mar 25 02:02:25.805404 sshd[6103]: Accepted publickey for core from 139.178.68.195 port 48690 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:25.806262 sshd-session[6103]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:25.813908 systemd-logind[1502]: New session 20 of user core.
Mar 25 02:02:25.817709 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 25 02:02:26.851799 sshd[6105]: Connection closed by 139.178.68.195 port 48690
Mar 25 02:02:26.854064 sshd-session[6103]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:26.862026 systemd[1]: sshd@17-10.230.42.214:22-139.178.68.195:48690.service: Deactivated successfully.
Mar 25 02:02:26.865000 systemd[1]: session-20.scope: Deactivated successfully.
Mar 25 02:02:26.866823 systemd-logind[1502]: Session 20 logged out. Waiting for processes to exit.
Mar 25 02:02:26.868950 systemd-logind[1502]: Removed session 20.
Mar 25 02:02:27.007106 systemd[1]: Started sshd@18-10.230.42.214:22-139.178.68.195:38318.service - OpenSSH per-connection server daemon (139.178.68.195:38318).
Mar 25 02:02:27.948698 sshd[6123]: Accepted publickey for core from 139.178.68.195 port 38318 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:27.951173 sshd-session[6123]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:27.958769 systemd-logind[1502]: New session 21 of user core.
Mar 25 02:02:27.967778 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 25 02:02:29.864896 sshd[6127]: Connection closed by 139.178.68.195 port 38318
Mar 25 02:02:29.866361 sshd-session[6123]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:29.873951 systemd-logind[1502]: Session 21 logged out. Waiting for processes to exit.
Mar 25 02:02:29.874419 systemd[1]: sshd@18-10.230.42.214:22-139.178.68.195:38318.service: Deactivated successfully.
Mar 25 02:02:29.878228 systemd[1]: session-21.scope: Deactivated successfully.
Mar 25 02:02:29.879745 systemd-logind[1502]: Removed session 21.
Mar 25 02:02:30.018947 systemd[1]: Started sshd@19-10.230.42.214:22-139.178.68.195:38332.service - OpenSSH per-connection server daemon (139.178.68.195:38332).
Mar 25 02:02:30.965341 sshd[6144]: Accepted publickey for core from 139.178.68.195 port 38332 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:30.967298 sshd-session[6144]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:30.978144 systemd-logind[1502]: New session 22 of user core.
Mar 25 02:02:30.991337 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 25 02:02:32.209458 sshd[6146]: Connection closed by 139.178.68.195 port 38332
Mar 25 02:02:32.210729 sshd-session[6144]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:32.215004 systemd[1]: sshd@19-10.230.42.214:22-139.178.68.195:38332.service: Deactivated successfully.
Mar 25 02:02:32.218187 systemd[1]: session-22.scope: Deactivated successfully.
Mar 25 02:02:32.220057 systemd-logind[1502]: Session 22 logged out. Waiting for processes to exit.
Mar 25 02:02:32.221651 systemd-logind[1502]: Removed session 22.
Mar 25 02:02:32.379383 systemd[1]: Started sshd@20-10.230.42.214:22-139.178.68.195:38336.service - OpenSSH per-connection server daemon (139.178.68.195:38336).
Mar 25 02:02:33.315592 sshd[6156]: Accepted publickey for core from 139.178.68.195 port 38336 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:33.317736 sshd-session[6156]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:33.325741 systemd-logind[1502]: New session 23 of user core.
Mar 25 02:02:33.331721 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 25 02:02:34.047576 sshd[6158]: Connection closed by 139.178.68.195 port 38336
Mar 25 02:02:34.048675 sshd-session[6156]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:34.056193 systemd-logind[1502]: Session 23 logged out. Waiting for processes to exit.
Mar 25 02:02:34.056383 systemd[1]: sshd@20-10.230.42.214:22-139.178.68.195:38336.service: Deactivated successfully.
Mar 25 02:02:34.061021 systemd[1]: session-23.scope: Deactivated successfully.
Mar 25 02:02:34.063210 systemd-logind[1502]: Removed session 23.
Mar 25 02:02:39.206607 systemd[1]: Started sshd@21-10.230.42.214:22-139.178.68.195:60148.service - OpenSSH per-connection server daemon (139.178.68.195:60148).
Mar 25 02:02:40.126614 sshd[6171]: Accepted publickey for core from 139.178.68.195 port 60148 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:40.128922 sshd-session[6171]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:40.138996 systemd-logind[1502]: New session 24 of user core.
Mar 25 02:02:40.146726 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 25 02:02:40.930153 sshd[6173]: Connection closed by 139.178.68.195 port 60148
Mar 25 02:02:40.931422 sshd-session[6171]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:40.937998 systemd[1]: sshd@21-10.230.42.214:22-139.178.68.195:60148.service: Deactivated successfully.
Mar 25 02:02:40.940666 systemd[1]: session-24.scope: Deactivated successfully.
Mar 25 02:02:40.941836 systemd-logind[1502]: Session 24 logged out. Waiting for processes to exit.
Mar 25 02:02:40.944147 systemd-logind[1502]: Removed session 24.
Mar 25 02:02:43.118618 containerd[1523]: time="2025-03-25T02:02:43.118435973Z" level=info msg="TaskExit event in podsandbox handler container_id:\"acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f\" id:\"7b5a5dd5b30c900182c1cfd1c908d8f16cadb2b8b794d1eacdd3a4167d9e0a1e\" pid:6197 exited_at:{seconds:1742868163 nanos:87287614}"
Mar 25 02:02:44.634848 containerd[1523]: time="2025-03-25T02:02:44.634518184Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b490e94ad79cac22d205e54a0983d144fbb05122429f247f5df944f76bff1cdc\" id:\"59f5c755ae5cccbfc573b5c5814b90d105f40a2ee4de94e28d6695085b08d57f\" pid:6220 exited_at:{seconds:1742868164 nanos:633843306}"
Mar 25 02:02:46.091409 systemd[1]: Started sshd@22-10.230.42.214:22-139.178.68.195:56702.service - OpenSSH per-connection server daemon (139.178.68.195:56702).
Mar 25 02:02:46.602912 containerd[1523]: time="2025-03-25T02:02:46.602832688Z" level=info msg="TaskExit event in podsandbox handler container_id:\"acaa8121afd082c522dcf7d90f6b74a45553837e75950c62e3765f2c2524fb4f\" id:\"03c06baa1060076fae15849433c81d0eab7e5f4e9b155388f1959bc729759cf1\" pid:6248 exited_at:{seconds:1742868166 nanos:602016048}"
Mar 25 02:02:47.090426 sshd[6233]: Accepted publickey for core from 139.178.68.195 port 56702 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:47.094365 sshd-session[6233]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:47.104252 systemd-logind[1502]: New session 25 of user core.
Mar 25 02:02:47.110755 systemd[1]: Started session-25.scope - Session 25 of User core.
Mar 25 02:02:48.164000 sshd[6257]: Connection closed by 139.178.68.195 port 56702
Mar 25 02:02:48.163837 sshd-session[6233]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:48.169997 systemd[1]: sshd@22-10.230.42.214:22-139.178.68.195:56702.service: Deactivated successfully.
Mar 25 02:02:48.172857 systemd[1]: session-25.scope: Deactivated successfully.
Mar 25 02:02:48.174762 systemd-logind[1502]: Session 25 logged out. Waiting for processes to exit.
Mar 25 02:02:48.176619 systemd-logind[1502]: Removed session 25.
Mar 25 02:02:53.326992 systemd[1]: Started sshd@23-10.230.42.214:22-139.178.68.195:56712.service - OpenSSH per-connection server daemon (139.178.68.195:56712).
Mar 25 02:02:54.281196 sshd[6269]: Accepted publickey for core from 139.178.68.195 port 56712 ssh2: RSA SHA256:sItDUi79TxocWG2seAU6ao+kw82mV8e8ogWXgVLmDo8
Mar 25 02:02:54.283677 sshd-session[6269]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 02:02:54.293157 systemd-logind[1502]: New session 26 of user core.
Mar 25 02:02:54.300782 systemd[1]: Started session-26.scope - Session 26 of User core.
Mar 25 02:02:55.127499 sshd[6271]: Connection closed by 139.178.68.195 port 56712
Mar 25 02:02:55.128491 sshd-session[6269]: pam_unix(sshd:session): session closed for user core
Mar 25 02:02:55.133922 systemd[1]: sshd@23-10.230.42.214:22-139.178.68.195:56712.service: Deactivated successfully.
Mar 25 02:02:55.136581 systemd[1]: session-26.scope: Deactivated successfully.
Mar 25 02:02:55.137795 systemd-logind[1502]: Session 26 logged out. Waiting for processes to exit.
Mar 25 02:02:55.139052 systemd-logind[1502]: Removed session 26.