Jan 29 16:11:26.065678 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Wed Jan 29 10:09:32 -00 2025
Jan 29 16:11:26.065713 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=befc9792b021bef43c896e00e1d5172b6224dbafc9b6c92b267e5e544378e681
Jan 29 16:11:26.065727 kernel: BIOS-provided physical RAM map:
Jan 29 16:11:26.065743 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 29 16:11:26.065753 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 29 16:11:26.065763 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 29 16:11:26.065775 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Jan 29 16:11:26.065785 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Jan 29 16:11:26.065795 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Jan 29 16:11:26.065805 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Jan 29 16:11:26.065816 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 29 16:11:26.065826 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 29 16:11:26.065841 kernel: NX (Execute Disable) protection: active
Jan 29 16:11:26.065852 kernel: APIC: Static calls initialized
Jan 29 16:11:26.065864 kernel: SMBIOS 2.8 present.
Jan 29 16:11:26.065876 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Jan 29 16:11:26.065887 kernel: Hypervisor detected: KVM
Jan 29 16:11:26.065903 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 29 16:11:26.065914 kernel: kvm-clock: using sched offset of 4972451069 cycles
Jan 29 16:11:26.065926 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 29 16:11:26.065938 kernel: tsc: Detected 2499.998 MHz processor
Jan 29 16:11:26.065949 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 29 16:11:26.065960 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 29 16:11:26.065972 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Jan 29 16:11:26.065983 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 29 16:11:26.065994 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 29 16:11:26.066022 kernel: Using GB pages for direct mapping
Jan 29 16:11:26.066033 kernel: ACPI: Early table checksum verification disabled
Jan 29 16:11:26.066044 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 29 16:11:26.066055 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 29 16:11:26.066066 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 29 16:11:26.066089 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 29 16:11:26.066099 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Jan 29 16:11:26.066110 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 29 16:11:26.066120 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 29 16:11:26.066136 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 29 16:11:26.066146 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 29 16:11:26.066157 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Jan 29 16:11:26.066167 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Jan 29 16:11:26.066178 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Jan 29 16:11:26.066229 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Jan 29 16:11:26.066241 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Jan 29 16:11:26.066270 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Jan 29 16:11:26.066280 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Jan 29 16:11:26.066290 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 29 16:11:26.066301 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Jan 29 16:11:26.066311 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 29 16:11:26.066321 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Jan 29 16:11:26.066332 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 29 16:11:26.066342 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Jan 29 16:11:26.066357 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 29 16:11:26.066368 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Jan 29 16:11:26.066391 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 29 16:11:26.066402 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Jan 29 16:11:26.066413 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 29 16:11:26.066423 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Jan 29 16:11:26.066434 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 29 16:11:26.066458 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Jan 29 16:11:26.066470 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 29 16:11:26.066486 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Jan 29 16:11:26.066497 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 29 16:11:26.066509 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 29 16:11:26.066520 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Jan 29 16:11:26.066531 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Jan 29 16:11:26.066543 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Jan 29 16:11:26.066555 kernel: Zone ranges:
Jan 29 16:11:26.066566 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 29 16:11:26.066589 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Jan 29 16:11:26.066618 kernel: Normal empty
Jan 29 16:11:26.066630 kernel: Movable zone start for each node
Jan 29 16:11:26.066642 kernel: Early memory node ranges
Jan 29 16:11:26.066654 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Jan 29 16:11:26.066666 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Jan 29 16:11:26.066677 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Jan 29 16:11:26.066689 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 29 16:11:26.066701 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 29 16:11:26.066712 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Jan 29 16:11:26.066724 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 29 16:11:26.066741 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 29 16:11:26.066753 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 29 16:11:26.066765 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 29 16:11:26.066777 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 29 16:11:26.066789 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 29 16:11:26.066801 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 29 16:11:26.066812 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 29 16:11:26.066824 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 29 16:11:26.066836 kernel: TSC deadline timer available
Jan 29 16:11:26.066853 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Jan 29 16:11:26.066865 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 29 16:11:26.066888 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Jan 29 16:11:26.066899 kernel: Booting paravirtualized kernel on KVM
Jan 29 16:11:26.066911 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 29 16:11:26.066922 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Jan 29 16:11:26.066947 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Jan 29 16:11:26.066959 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Jan 29 16:11:26.066971 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Jan 29 16:11:26.066987 kernel: kvm-guest: PV spinlocks enabled
Jan 29 16:11:26.066999 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 29 16:11:26.067013 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=befc9792b021bef43c896e00e1d5172b6224dbafc9b6c92b267e5e544378e681
Jan 29 16:11:26.067025 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 29 16:11:26.067037 kernel: random: crng init done
Jan 29 16:11:26.067049 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 29 16:11:26.067061 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 29 16:11:26.067072 kernel: Fallback order for Node 0: 0
Jan 29 16:11:26.067089 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Jan 29 16:11:26.067101 kernel: Policy zone: DMA32
Jan 29 16:11:26.067113 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 29 16:11:26.067125 kernel: software IO TLB: area num 16.
Jan 29 16:11:26.067137 kernel: Memory: 1901536K/2096616K available (12288K kernel code, 2301K rwdata, 22728K rodata, 42844K init, 2348K bss, 194820K reserved, 0K cma-reserved)
Jan 29 16:11:26.067149 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Jan 29 16:11:26.067161 kernel: Kernel/User page tables isolation: enabled
Jan 29 16:11:26.067173 kernel: ftrace: allocating 37921 entries in 149 pages
Jan 29 16:11:26.067184 kernel: ftrace: allocated 149 pages with 4 groups
Jan 29 16:11:26.067351 kernel: Dynamic Preempt: voluntary
Jan 29 16:11:26.067363 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 29 16:11:26.067374 kernel: rcu: RCU event tracing is enabled.
Jan 29 16:11:26.067385 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Jan 29 16:11:26.067396 kernel: Trampoline variant of Tasks RCU enabled.
Jan 29 16:11:26.067418 kernel: Rude variant of Tasks RCU enabled.
Jan 29 16:11:26.067433 kernel: Tracing variant of Tasks RCU enabled.
Jan 29 16:11:26.067444 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 29 16:11:26.067455 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Jan 29 16:11:26.067478 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Jan 29 16:11:26.067489 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 29 16:11:26.067501 kernel: Console: colour VGA+ 80x25
Jan 29 16:11:26.067517 kernel: printk: console [tty0] enabled
Jan 29 16:11:26.067528 kernel: printk: console [ttyS0] enabled
Jan 29 16:11:26.067539 kernel: ACPI: Core revision 20230628
Jan 29 16:11:26.067551 kernel: APIC: Switch to symmetric I/O mode setup
Jan 29 16:11:26.067562 kernel: x2apic enabled
Jan 29 16:11:26.067590 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 29 16:11:26.067617 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Jan 29 16:11:26.067630 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998)
Jan 29 16:11:26.067642 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 29 16:11:26.067654 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Jan 29 16:11:26.067666 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Jan 29 16:11:26.067678 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 29 16:11:26.067690 kernel: Spectre V2 : Mitigation: Retpolines
Jan 29 16:11:26.067702 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Jan 29 16:11:26.067714 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Jan 29 16:11:26.067732 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Jan 29 16:11:26.067744 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 29 16:11:26.067756 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 29 16:11:26.067768 kernel: MDS: Mitigation: Clear CPU buffers
Jan 29 16:11:26.067780 kernel: MMIO Stale Data: Unknown: No mitigations
Jan 29 16:11:26.067791 kernel: SRBDS: Unknown: Dependent on hypervisor status
Jan 29 16:11:26.067803 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 29 16:11:26.067816 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 29 16:11:26.067827 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 29 16:11:26.067839 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 29 16:11:26.067856 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Jan 29 16:11:26.067881 kernel: Freeing SMP alternatives memory: 32K
Jan 29 16:11:26.067892 kernel: pid_max: default: 32768 minimum: 301
Jan 29 16:11:26.067904 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 29 16:11:26.067915 kernel: landlock: Up and running.
Jan 29 16:11:26.067927 kernel: SELinux: Initializing.
Jan 29 16:11:26.067938 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 29 16:11:26.067950 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 29 16:11:26.067974 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Jan 29 16:11:26.067986 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 29 16:11:26.067998 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 29 16:11:26.068028 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 29 16:11:26.068040 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Jan 29 16:11:26.068052 kernel: signal: max sigframe size: 1776
Jan 29 16:11:26.068063 kernel: rcu: Hierarchical SRCU implementation.
Jan 29 16:11:26.068075 kernel: rcu: Max phase no-delay instances is 400.
Jan 29 16:11:26.068087 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Jan 29 16:11:26.068098 kernel: smp: Bringing up secondary CPUs ...
Jan 29 16:11:26.068110 kernel: smpboot: x86: Booting SMP configuration:
Jan 29 16:11:26.068121 kernel: .... node #0, CPUs: #1
Jan 29 16:11:26.068138 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Jan 29 16:11:26.068150 kernel: smp: Brought up 1 node, 2 CPUs
Jan 29 16:11:26.068161 kernel: smpboot: Max logical packages: 16
Jan 29 16:11:26.068173 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS)
Jan 29 16:11:26.068184 kernel: devtmpfs: initialized
Jan 29 16:11:26.068196 kernel: x86/mm: Memory block size: 128MB
Jan 29 16:11:26.068232 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 29 16:11:26.068245 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Jan 29 16:11:26.068256 kernel: pinctrl core: initialized pinctrl subsystem
Jan 29 16:11:26.068273 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 29 16:11:26.068297 kernel: audit: initializing netlink subsys (disabled)
Jan 29 16:11:26.068308 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 29 16:11:26.068319 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 29 16:11:26.068330 kernel: audit: type=2000 audit(1738167084.150:1): state=initialized audit_enabled=0 res=1
Jan 29 16:11:26.068340 kernel: cpuidle: using governor menu
Jan 29 16:11:26.068351 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 29 16:11:26.068362 kernel: dca service started, version 1.12.1
Jan 29 16:11:26.068373 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Jan 29 16:11:26.068389 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Jan 29 16:11:26.068400 kernel: PCI: Using configuration type 1 for base access
Jan 29 16:11:26.068422 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 29 16:11:26.068434 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 29 16:11:26.068446 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 29 16:11:26.068458 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 29 16:11:26.068469 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 29 16:11:26.068493 kernel: ACPI: Added _OSI(Module Device)
Jan 29 16:11:26.068505 kernel: ACPI: Added _OSI(Processor Device)
Jan 29 16:11:26.068522 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Jan 29 16:11:26.068546 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 29 16:11:26.068558 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 29 16:11:26.068571 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Jan 29 16:11:26.068583 kernel: ACPI: Interpreter enabled
Jan 29 16:11:26.068603 kernel: ACPI: PM: (supports S0 S5)
Jan 29 16:11:26.068618 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 29 16:11:26.068631 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 29 16:11:26.068644 kernel: PCI: Using E820 reservations for host bridge windows
Jan 29 16:11:26.068662 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 29 16:11:26.068675 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 29 16:11:26.068960 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 29 16:11:26.069174 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 29 16:11:26.069371 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 29 16:11:26.069389 kernel: PCI host bridge to bus 0000:00
Jan 29 16:11:26.069553 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jan 29 16:11:26.069737 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jan 29 16:11:26.069889 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 29 16:11:26.070057 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Jan 29 16:11:26.070216 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 29 16:11:26.070398 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Jan 29 16:11:26.070549 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 29 16:11:26.070759 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Jan 29 16:11:26.071031 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Jan 29 16:11:26.071405 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Jan 29 16:11:26.071613 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Jan 29 16:11:26.071793 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Jan 29 16:11:26.071993 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 29 16:11:26.072236 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Jan 29 16:11:26.072448 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Jan 29 16:11:26.072658 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Jan 29 16:11:26.072838 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Jan 29 16:11:26.073066 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Jan 29 16:11:26.074279 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Jan 29 16:11:26.074464 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Jan 29 16:11:26.074676 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Jan 29 16:11:26.074861 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Jan 29 16:11:26.075026 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Jan 29 16:11:26.075221 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Jan 29 16:11:26.075387 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Jan 29 16:11:26.075560 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Jan 29 16:11:26.078439 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Jan 29 16:11:26.078661 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Jan 29 16:11:26.078834 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Jan 29 16:11:26.079045 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Jan 29 16:11:26.080278 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Jan 29 16:11:26.080491 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Jan 29 16:11:26.080677 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Jan 29 16:11:26.080853 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Jan 29 16:11:26.081054 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Jan 29 16:11:26.082233 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Jan 29 16:11:26.082433 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Jan 29 16:11:26.082630 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Jan 29 16:11:26.082809 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Jan 29 16:11:26.082996 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jan 29 16:11:26.083175 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Jan 29 16:11:26.084416 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Jan 29 16:11:26.084608 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Jan 29 16:11:26.084797 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Jan 29 16:11:26.084981 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Jan 29 16:11:26.085194 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Jan 29 16:11:26.086403 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Jan 29 16:11:26.086577 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 29 16:11:26.086758 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Jan 29 16:11:26.086923 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 29 16:11:26.087118 kernel: pci_bus 0000:02: extended config space not accessible
Jan 29 16:11:26.087340 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Jan 29 16:11:26.087553 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Jan 29 16:11:26.087740 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 29 16:11:26.087909 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jan 29 16:11:26.088096 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Jan 29 16:11:26.090345 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Jan 29 16:11:26.090527 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 29 16:11:26.090714 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Jan 29 16:11:26.090895 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 29 16:11:26.091093 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Jan 29 16:11:26.093319 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Jan 29 16:11:26.093490 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 29 16:11:26.093682 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Jan 29 16:11:26.093848 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 29 16:11:26.094027 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 29 16:11:26.094186 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Jan 29 16:11:26.094369 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 29 16:11:26.094529 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 29 16:11:26.094717 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Jan 29 16:11:26.094879 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 29 16:11:26.095061 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 29 16:11:26.095251 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Jan 29 16:11:26.095413 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 29 16:11:26.095569 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 29 16:11:26.095766 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Jan 29 16:11:26.095937 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 29 16:11:26.096097 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 29 16:11:26.098851 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Jan 29 16:11:26.099028 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 29 16:11:26.099049 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 29 16:11:26.099063 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 29 16:11:26.099097 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 29 16:11:26.099110 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 29 16:11:26.099144 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jan 29 16:11:26.099156 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jan 29 16:11:26.099184 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jan 29 16:11:26.099197 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jan 29 16:11:26.099210 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jan 29 16:11:26.099222 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jan 29 16:11:26.099243 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jan 29 16:11:26.099268 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jan 29 16:11:26.099280 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jan 29 16:11:26.099295 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jan 29 16:11:26.099307 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jan 29 16:11:26.099331 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jan 29 16:11:26.099343 kernel: iommu: Default domain type: Translated
Jan 29 16:11:26.099355 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 29 16:11:26.099366 kernel: PCI: Using ACPI for IRQ routing
Jan 29 16:11:26.099383 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 29 16:11:26.099... kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 29 16:11:26.099552 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Jan 29 16:11:26.099552 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jan 29 16:11:26.099744 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jan 29 16:11:26.099920 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 29 16:11:26.099939 kernel: vgaarb: loaded
Jan 29 16:11:26.099964 kernel: clocksource: Switched to clocksource kvm-clock
Jan 29 16:11:26.099976 kernel: VFS: Disk quotas dquot_6.6.0
Jan 29 16:11:26.099988 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 29 16:11:26.100000 kernel: pnp: PnP ACPI init
Jan 29 16:11:26.100193 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Jan 29 16:11:26.100212 kernel: pnp: PnP ACPI: found 5 devices
Jan 29 16:11:26.100239 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 29 16:11:26.100254 kernel: NET: Registered PF_INET protocol family
Jan 29 16:11:26.100266 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 29 16:11:26.100277 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jan 29 16:11:26.100294 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 29 16:11:26.100305 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 29 16:11:26.100325 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jan 29 16:11:26.100336 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jan 29 16:11:26.100348 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 29 16:11:26.100360 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 29 16:11:26.100371 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 29 16:11:26.100382 kernel: NET: Registered PF_XDP protocol family
Jan 29 16:11:26.100565 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Jan 29 16:11:26.100744 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 29 16:11:26.100916 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 29 16:11:26.101096 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 29 16:11:26.101291 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 29 16:11:26.101452 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 29 16:11:26.101651 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 29 16:11:26.101814 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 29 16:11:26.102005 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Jan 29 16:11:26.102178 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Jan 29 16:11:26.102340 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Jan 29 16:11:26.102485 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Jan 29 16:11:26.102672 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Jan 29 16:11:26.102837 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Jan 29 16:11:26.103001 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Jan 29 16:11:26.103189 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Jan 29 16:11:26.106295 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 29 16:11:26.106494 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jan 29 16:11:26.106685 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 29 16:11:26.106849 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Jan 29 16:11:26.107011 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Jan 29 16:11:26.108206 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 29 16:11:26.108405 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 29 16:11:26.108570 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Jan 29 16:11:26.108758 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Jan 29 16:11:26.108922 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 29 16:11:26.109084 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 29 16:11:26.110296 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Jan 29 16:11:26.110468 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Jan 29 16:11:26.110672 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 29 16:11:26.110844 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 29 16:11:26.111005 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Jan 29 16:11:26.112219 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Jan 29 16:11:26.112401 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 29 16:11:26.112574 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 29 16:11:26.112751 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Jan 29 16:11:26.112914 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Jan 29 16:11:26.113101 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 29 16:11:26.115332 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 29 16:11:26.115501 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Jan 29 16:11:26.115689 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Jan 29 16:11:26.115851 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 29 16:11:26.116025 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 29 16:11:26.116211 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Jan 29 16:11:26.116474 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Jan 29 16:11:26.116662 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 29 16:11:26.116828 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 29 16:11:26.117025 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Jan 29 16:11:26.117200 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Jan 29 16:11:26.119353 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 29 16:11:26.119509 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Jan 29 16:11:26.119674 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Jan 29 16:11:26.119825 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 29 16:11:26.120010 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Jan 29 16:11:26.120178 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Jan 29 16:11:26.120368 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Jan 29 16:11:26.120544 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Jan 29 16:11:26.120716 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Jan 29 16:11:26.120874 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 29 16:11:26.121039 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Jan 29 16:11:26.123241 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Jan 29 16:11:26.123476 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Jan 29 16:11:26.123755 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 29 16:11:26.124026 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Jan 29 16:11:26.124164 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Jan 29 16:11:26.124520 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 29 16:11:26.124736 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Jan 29 16:11:26.124904 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Jan 29 16:11:26.125043 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 29 16:11:26.127218 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Jan 29 16:11:26.127386 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Jan 29 16:11:26.127546 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 29 16:11:26.127723 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Jan 29 16:11:26.127899 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Jan 29 16:11:26.128048 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 29 16:11:26.128266 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Jan 29 16:11:26.128451 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Jan 29 16:11:26.128647 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 29 16:11:26.128827 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Jan 29 16:11:26.128999 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Jan 29 16:11:26.129284 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 29 16:11:26.129307 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Jan 29 16:11:26.129320 kernel: PCI: CLS 0 bytes, default 64
Jan 29 16:11:26.129333 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 29 16:11:26.129347 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB)
Jan 29 16:11:26.129360 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Jan 29 16:11:26.129386 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Jan 29 16:11:26.129399 kernel: Initialise system trusted keyrings
Jan 29 16:11:26.129420 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Jan 29 16:11:26.129433 kernel: Key type asymmetric registered
Jan 29 16:11:26.129446 kernel: Asymmetric key parser 'x509' registered
Jan 29 16:11:26.129459 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Jan 29 16:11:26.129472 kernel: io scheduler mq-deadline registered
Jan 29 16:11:26.129486 kernel: io scheduler kyber registered
Jan 29 16:11:26.129499 kernel: io scheduler bfq registered
Jan 29 16:11:26.129678 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Jan 29 16:11:26.129843 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Jan 29 16:11:26.130014 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 29 16:11:26.130238 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Jan 29 16:11:26.130412 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Jan 29 16:11:26.130587 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 29 16:11:26.130766 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Jan 29 16:11:26.130928 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Jan 29 16:11:26.131108 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 29 16:11:26.131306 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Jan 29 16:11:26.131479 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Jan 29 16:11:26.131656 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 29 16:11:26.131821 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Jan 29 16:11:26.132005 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Jan 29 16:11:26.132220 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 29 16:11:26.132430 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Jan 29 16:11:26.132614 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Jan 29 16:11:26.132780 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 29 16:11:26.132968 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Jan 29 16:11:26.133110 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Jan 29 16:11:26.133288 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 29 16:11:26.133433 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Jan 29 16:11:26.133628 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Jan 29 16:11:26.133796 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 29 16:11:26.133817 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Jan 29 16:11:26.133831 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Jan 29 16:11:26.133853 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Jan 29 16:11:26.133866 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 29 16:11:26.133880 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 29 16:11:26.133893 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 29 16:11:26.133906 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 29 16:11:26.133920 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 29 16:11:26.134119 kernel: rtc_cmos 00:03: RTC can wake from S4
Jan 29 16:11:26.134138 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Jan 29 16:11:26.134345 kernel: rtc_cmos 00:03: registered as rtc0
Jan 29 16:11:26.134494 kernel: rtc_cmos 00:03: setting system clock to 2025-01-29T16:11:25 UTC (1738167085)
Jan 29 16:11:26.134684 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Jan 29 16:11:26.134705 kernel: intel_pstate: CPU model not supported
Jan 29 16:11:26.134718 kernel: NET: Registered PF_INET6 protocol family
Jan 29 16:11:26.134732 kernel: Segment Routing with IPv6
Jan 29 16:11:26.134745 kernel: In-situ OAM (IOAM) with IPv6
Jan 29 16:11:26.134758 kernel: NET: Registered PF_PACKET protocol family
Jan 29 16:11:26.134771 kernel: Key type dns_resolver registered
Jan 29 16:11:26.134792 kernel: IPI shorthand broadcast: enabled
Jan 29 16:11:26.134810 kernel: sched_clock: Marking stable (1197004016, 240066602)->(1677613398, -240542780)
Jan 29 16:11:26.134824 kernel: registered taskstats version 1
Jan 29 16:11:26.134837 kernel: Loading compiled-in X.509 certificates
Jan 29 16:11:26.134850 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: 1efdcbe72fc44d29e4e6411cf9a3e64046be4375'
Jan 29 16:11:26.134863 kernel: Key type .fscrypt registered
Jan 29 16:11:26.134876 kernel: Key type fscrypt-provisioning registered
Jan 29 16:11:26.134889 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 29 16:11:26.134903 kernel: ima: Allocated hash algorithm: sha1
Jan 29 16:11:26.134921 kernel: ima: No architecture policies found
Jan 29 16:11:26.134935 kernel: clk: Disabling unused clocks
Jan 29 16:11:26.134965 kernel: Freeing unused kernel image (initmem) memory: 42844K
Jan 29 16:11:26.134978 kernel: Write protecting the kernel read-only data: 36864k
Jan 29 16:11:26.134991 kernel: Freeing unused kernel image (rodata/data gap) memory: 1848K
Jan 29 16:11:26.135004 kernel: Run /init as init process
Jan 29 16:11:26.135029 kernel: with arguments:
Jan 29 16:11:26.135040 kernel: /init
Jan 29 16:11:26.135052 kernel: with environment:
Jan 29 16:11:26.135068 kernel: HOME=/
Jan 29 16:11:26.135092 kernel: TERM=linux
Jan 29 16:11:26.135103 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Jan 29 16:11:26.135117 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 29 16:11:26.135131 systemd[1]: Detected virtualization kvm.
Jan 29 16:11:26.135144 systemd[1]: Detected architecture x86-64.
Jan 29 16:11:26.135155 systemd[1]: Running in initrd.
Jan 29 16:11:26.135167 systemd[1]: No hostname configured, using default hostname.
Jan 29 16:11:26.135184 systemd[1]: Hostname set to .
Jan 29 16:11:26.135196 systemd[1]: Initializing machine ID from VM UUID.
Jan 29 16:11:26.135222 systemd[1]: Queued start job for default target initrd.target.
Jan 29 16:11:26.135235 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 29 16:11:26.135248 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 29 16:11:26.135261 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jan 29 16:11:26.135273 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 29 16:11:26.135286 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jan 29 16:11:26.135305 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jan 29 16:11:26.135319 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Jan 29 16:11:26.135332 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Jan 29 16:11:26.135344 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 29 16:11:26.135356 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 29 16:11:26.135368 systemd[1]: Reached target paths.target - Path Units.
Jan 29 16:11:26.135385 systemd[1]: Reached target slices.target - Slice Units.
Jan 29 16:11:26.135397 systemd[1]: Reached target swap.target - Swaps.
Jan 29 16:11:26.135410 systemd[1]: Reached target timers.target - Timer Units.
Jan 29 16:11:26.135422 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jan 29 16:11:26.135434 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 29 16:11:26.135447 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 29 16:11:26.135459 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Jan 29 16:11:26.135471 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 29 16:11:26.135484 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 29 16:11:26.135501 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 29 16:11:26.135513 systemd[1]: Reached target sockets.target - Socket Units.
Jan 29 16:11:26.135537 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jan 29 16:11:26.135550 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 29 16:11:26.135562 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jan 29 16:11:26.135575 systemd[1]: Starting systemd-fsck-usr.service...
Jan 29 16:11:26.135613 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 29 16:11:26.135628 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 29 16:11:26.135643 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 29 16:11:26.135678 systemd-journald[201]: Collecting audit messages is disabled.
Jan 29 16:11:26.135708 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jan 29 16:11:26.135741 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 29 16:11:26.135756 systemd[1]: Finished systemd-fsck-usr.service.
Jan 29 16:11:26.135770 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 29 16:11:26.135792 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 29 16:11:26.135806 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 29 16:11:26.135820 kernel: Bridge firewalling registered
Jan 29 16:11:26.135834 systemd-journald[201]: Journal started
Jan 29 16:11:26.135866 systemd-journald[201]: Runtime Journal (/run/log/journal/9a09d7ac1c464f098fb3cdcb93f0163c) is 4.7M, max 38.0M, 33.2M free.
Jan 29 16:11:26.071364 systemd-modules-load[202]: Inserted module 'overlay'
Jan 29 16:11:26.127288 systemd-modules-load[202]: Inserted module 'br_netfilter'
Jan 29 16:11:26.191525 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 29 16:11:26.194906 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 29 16:11:26.196034 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 29 16:11:26.206420 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 29 16:11:26.220562 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 29 16:11:26.233400 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 29 16:11:26.234651 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 29 16:11:26.238695 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 29 16:11:26.244359 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 29 16:11:26.250641 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 29 16:11:26.260496 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 29 16:11:26.264257 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 29 16:11:26.278325 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 29 16:11:26.279549 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 29 16:11:26.296123 dracut-cmdline[235]: dracut-dracut-053
Jan 29 16:11:26.299550 dracut-cmdline[235]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=befc9792b021bef43c896e00e1d5172b6224dbafc9b6c92b267e5e544378e681
Jan 29 16:11:26.316630 systemd-resolved[234]: Positive Trust Anchors:
Jan 29 16:11:26.317672 systemd-resolved[234]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 29 16:11:26.318735 systemd-resolved[234]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 29 16:11:26.326383 systemd-resolved[234]: Defaulting to hostname 'linux'.
Jan 29 16:11:26.328964 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 29 16:11:26.330770 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 29 16:11:26.398283 kernel: SCSI subsystem initialized
Jan 29 16:11:26.410211 kernel: Loading iSCSI transport class v2.0-870.
Jan 29 16:11:26.425237 kernel: iscsi: registered transport (tcp)
Jan 29 16:11:26.452226 kernel: iscsi: registered transport (qla4xxx)
Jan 29 16:11:26.452283 kernel: QLogic iSCSI HBA Driver
Jan 29 16:11:26.506890 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 29 16:11:26.514401 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 29 16:11:26.548713 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 29 16:11:26.551080 kernel: device-mapper: uevent: version 1.0.3
Jan 29 16:11:26.551105 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Jan 29 16:11:26.602266 kernel: raid6: sse2x4 gen() 13447 MB/s
Jan 29 16:11:26.617234 kernel: raid6: sse2x2 gen() 9115 MB/s
Jan 29 16:11:26.635909 kernel: raid6: sse2x1 gen() 9807 MB/s
Jan 29 16:11:26.635960 kernel: raid6: using algorithm sse2x4 gen() 13447 MB/s
Jan 29 16:11:26.655035 kernel: raid6: .... xor() 7666 MB/s, rmw enabled
Jan 29 16:11:26.655075 kernel: raid6: using ssse3x2 recovery algorithm
Jan 29 16:11:26.681232 kernel: xor: automatically using best checksumming function avx
Jan 29 16:11:26.882264 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 29 16:11:26.896563 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 29 16:11:26.902459 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 29 16:11:26.928900 systemd-udevd[420]: Using default interface naming scheme 'v255'.
Jan 29 16:11:26.936119 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 29 16:11:26.945409 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 29 16:11:26.966400 dracut-pre-trigger[423]: rd.md=0: removing MD RAID activation
Jan 29 16:11:27.006353 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 29 16:11:27.011372 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 29 16:11:27.122766 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 29 16:11:27.132397 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 29 16:11:27.164516 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 29 16:11:27.169187 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 29 16:11:27.172054 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 29 16:11:27.173388 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 29 16:11:27.183682 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 29 16:11:27.210287 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 29 16:11:27.246193 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues
Jan 29 16:11:27.307109 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB)
Jan 29 16:11:27.307365 kernel: cryptd: max_cpu_qlen set to 1000
Jan 29 16:11:27.307386 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Jan 29 16:11:27.307402 kernel: GPT:17805311 != 125829119
Jan 29 16:11:27.307437 kernel: GPT:Alternate GPT header not at the end of the disk.
Jan 29 16:11:27.307453 kernel: GPT:17805311 != 125829119
Jan 29 16:11:27.307468 kernel: GPT: Use GNU Parted to correct GPT errors.
Jan 29 16:11:27.307484 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 29 16:11:27.307499 kernel: AVX version of gcm_enc/dec engaged.
Jan 29 16:11:27.307515 kernel: AES CTR mode by8 optimization enabled
Jan 29 16:11:27.284611 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 29 16:11:27.284779 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 29 16:11:27.285771 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 29 16:11:27.288609 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 29 16:11:27.288782 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 29 16:11:27.289632 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 29 16:11:27.308613 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 29 16:11:27.358200 kernel: BTRFS: device fsid 64bb5b5a-85cc-41cc-a02b-2cfaa3e93b0a devid 1 transid 38 /dev/vda3 scanned by (udev-worker) (471)
Jan 29 16:11:27.379281 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Jan 29 16:11:27.496880 kernel: ACPI: bus type USB registered
Jan 29 16:11:27.496926 kernel: usbcore: registered new interface driver usbfs
Jan 29 16:11:27.496946 kernel: usbcore: registered new interface driver hub
Jan 29 16:11:27.496965 kernel: usbcore: registered new device driver usb
Jan 29 16:11:27.496983 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 scanned by (udev-worker) (466)
Jan 29 16:11:27.497002 kernel: libata version 3.00 loaded.
Jan 29 16:11:27.497021 kernel: ahci 0000:00:1f.2: version 3.0
Jan 29 16:11:27.497354 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Jan 29 16:11:27.497390 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Jan 29 16:11:27.497635 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Jan 29 16:11:27.497833 kernel: scsi host0: ahci
Jan 29 16:11:27.498068 kernel: scsi host1: ahci
Jan 29 16:11:27.498297 kernel: scsi host2: ahci
Jan 29 16:11:27.498539 kernel: scsi host3: ahci
Jan 29 16:11:27.498759 kernel: scsi host4: ahci
Jan 29 16:11:27.498988 kernel: scsi host5: ahci
Jan 29 16:11:27.499223 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38
Jan 29 16:11:27.499245 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38
Jan 29 16:11:27.499264 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38
Jan 29 16:11:27.499282 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38
Jan 29 16:11:27.499317 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38
Jan 29 16:11:27.499337 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38
Jan 29 16:11:27.497913 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 29 16:11:27.505794 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Jan 29 16:11:27.518379 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Jan 29 16:11:27.524516 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Jan 29 16:11:27.525377 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Jan 29 16:11:27.544404 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jan 29 16:11:27.549225 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 29 16:11:27.555071 disk-uuid[561]: Primary Header is updated.
Jan 29 16:11:27.555071 disk-uuid[561]: Secondary Entries is updated.
Jan 29 16:11:27.555071 disk-uuid[561]: Secondary Header is updated.
Jan 29 16:11:27.562210 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 29 16:11:27.570217 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 29 16:11:27.575618 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 29 16:11:27.748551 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jan 29 16:11:27.748637 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jan 29 16:11:27.756983 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jan 29 16:11:27.757041 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jan 29 16:11:27.759605 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jan 29 16:11:27.759649 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jan 29 16:11:27.770844 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Jan 29 16:11:27.789867 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1 Jan 29 16:11:27.790127 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 29 16:11:27.790366 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller Jan 29 16:11:27.790606 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2 Jan 29 16:11:27.790828 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed Jan 29 16:11:27.791030 kernel: hub 1-0:1.0: USB hub found Jan 29 16:11:27.791304 kernel: hub 1-0:1.0: 4 ports detected Jan 29 16:11:27.791498 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 29 16:11:27.791814 kernel: hub 2-0:1.0: USB hub found Jan 29 16:11:27.792040 kernel: hub 2-0:1.0: 4 ports detected Jan 29 16:11:28.023275 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 29 16:11:28.166185 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 29 16:11:28.172350 kernel: usbcore: registered new interface driver usbhid Jan 29 16:11:28.172419 kernel: usbhid: USB HID core driver Jan 29 16:11:28.179246 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2 Jan 29 16:11:28.179290 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0 Jan 29 16:11:28.577887 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jan 29 16:11:28.578064 disk-uuid[562]: The operation has completed successfully. Jan 29 16:11:28.639816 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 29 16:11:28.640431 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 29 16:11:28.669418 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 29 16:11:28.674693 sh[584]: Success Jan 29 16:11:28.692210 kernel: device-mapper: verity: sha256 using implementation "sha256-avx" Jan 29 16:11:28.750922 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 29 16:11:28.759296 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 29 16:11:28.764445 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
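verity-setup.service assembles /dev/mapper/usr from the read-only USR-A partition, validated against the verity.usrhash= root hash on the kernel command line; the "sha256 using implementation sha256-avx" line is device-mapper activating that table. A rough manual equivalent is sketched below. This is an assumption-laden illustration: on Flatcar the hash tree lives inside the same partition at an offset, so a real invocation would also need the correct --hash-offset, omitted here.

  roothash=$(tr ' ' '\n' </proc/cmdline | sed -n 's/^verity.usrhash=//p')
  veritysetup open /dev/disk/by-partlabel/USR-A usr /dev/disk/by-partlabel/USR-A "$roothash"
  veritysetup status usr   # reports whether the mapping verifies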
Jan 29 16:11:28.796207 kernel: BTRFS info (device dm-0): first mount of filesystem 64bb5b5a-85cc-41cc-a02b-2cfaa3e93b0a Jan 29 16:11:28.796252 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 29 16:11:28.796271 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 29 16:11:28.796293 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 29 16:11:28.797235 kernel: BTRFS info (device dm-0): using free space tree Jan 29 16:11:28.808578 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 29 16:11:28.810069 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 29 16:11:28.817386 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 29 16:11:28.819952 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 29 16:11:28.839830 kernel: BTRFS info (device vda6): first mount of filesystem aa75aabd-8755-4402-b4b6-23093345fe03 Jan 29 16:11:28.839890 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 29 16:11:28.839910 kernel: BTRFS info (device vda6): using free space tree Jan 29 16:11:28.846213 kernel: BTRFS info (device vda6): auto enabling async discard Jan 29 16:11:28.858099 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 29 16:11:28.860753 kernel: BTRFS info (device vda6): last unmount of filesystem aa75aabd-8755-4402-b4b6-23093345fe03 Jan 29 16:11:28.867472 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 29 16:11:28.873420 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 29 16:11:28.978651 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 29 16:11:28.993593 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 29 16:11:29.024148 systemd-networkd[767]: lo: Link UP Jan 29 16:11:29.025155 systemd-networkd[767]: lo: Gained carrier Jan 29 16:11:29.028417 systemd-networkd[767]: Enumeration completed Jan 29 16:11:29.029293 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 29 16:11:29.031478 systemd-networkd[767]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 16:11:29.031099 ignition[685]: Ignition 2.19.0 Jan 29 16:11:29.031484 systemd-networkd[767]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 29 16:11:29.031118 ignition[685]: Stage: fetch-offline Jan 29 16:11:29.032493 systemd[1]: Reached target network.target - Network. Jan 29 16:11:29.031226 ignition[685]: no configs at "/usr/lib/ignition/base.d" Jan 29 16:11:29.035451 systemd-networkd[767]: eth0: Link UP Jan 29 16:11:29.031252 ignition[685]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 29 16:11:29.035457 systemd-networkd[767]: eth0: Gained carrier Jan 29 16:11:29.031836 ignition[685]: parsed url from cmdline: "" Jan 29 16:11:29.035468 systemd-networkd[767]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 16:11:29.031843 ignition[685]: no config URL provided Jan 29 16:11:29.036412 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
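The BTRFS warning shows the initrd still mounting the verity-backed /usr with the deprecated nologreplay option. Using the spelling the kernel suggests, the equivalent one-off mount (read-only is mandatory when log replay is skipped) would be:

  mount -t btrfs -o ro,rescue=nologreplay /dev/mapper/usr /sysusr/usr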
Jan 29 16:11:29.031853 ignition[685]: reading system config file "/usr/lib/ignition/user.ign" Jan 29 16:11:29.044501 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 29 16:11:29.031880 ignition[685]: no config at "/usr/lib/ignition/user.ign" Jan 29 16:11:29.031889 ignition[685]: failed to fetch config: resource requires networking Jan 29 16:11:29.051554 systemd-networkd[767]: eth0: DHCPv4 address 10.230.37.146/30, gateway 10.230.37.145 acquired from 10.230.37.145 Jan 29 16:11:29.032741 ignition[685]: Ignition finished successfully Jan 29 16:11:29.068152 ignition[775]: Ignition 2.19.0 Jan 29 16:11:29.068201 ignition[775]: Stage: fetch Jan 29 16:11:29.068563 ignition[775]: no configs at "/usr/lib/ignition/base.d" Jan 29 16:11:29.068585 ignition[775]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 29 16:11:29.068749 ignition[775]: parsed url from cmdline: "" Jan 29 16:11:29.068757 ignition[775]: no config URL provided Jan 29 16:11:29.068771 ignition[775]: reading system config file "/usr/lib/ignition/user.ign" Jan 29 16:11:29.068788 ignition[775]: no config at "/usr/lib/ignition/user.ign" Jan 29 16:11:29.069010 ignition[775]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 29 16:11:29.071665 ignition[775]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 29 16:11:29.071678 ignition[775]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 29 16:11:29.085842 ignition[775]: GET result: OK Jan 29 16:11:29.086766 ignition[775]: parsing config with SHA512: 5b1e9af155458abe2cfdd18a84c3e4724931be13feaea720e604fef969f50112b8fcccbc91044a6eb08b6acda4dc0e3def0516bb0b102f9f61d8c93a01ad8853 Jan 29 16:11:29.093509 unknown[775]: fetched base config from "system" Jan 29 16:11:29.094631 unknown[775]: fetched base config from "system" Jan 29 16:11:29.095372 unknown[775]: fetched user config from "openstack" Jan 29 16:11:29.095906 ignition[775]: fetch: fetch complete Jan 29 16:11:29.095925 ignition[775]: fetch: fetch passed Jan 29 16:11:29.096000 ignition[775]: Ignition finished successfully Jan 29 16:11:29.098883 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 29 16:11:29.113414 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 29 16:11:29.135644 ignition[783]: Ignition 2.19.0 Jan 29 16:11:29.135664 ignition[783]: Stage: kargs Jan 29 16:11:29.135961 ignition[783]: no configs at "/usr/lib/ignition/base.d" Jan 29 16:11:29.135983 ignition[783]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 29 16:11:29.137466 ignition[783]: kargs: kargs passed Jan 29 16:11:29.137595 ignition[783]: Ignition finished successfully Jan 29 16:11:29.141057 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 29 16:11:29.148444 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 29 16:11:29.178883 ignition[789]: Ignition 2.19.0 Jan 29 16:11:29.178899 ignition[789]: Stage: disks Jan 29 16:11:29.179258 ignition[789]: no configs at "/usr/lib/ignition/base.d" Jan 29 16:11:29.179277 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 29 16:11:29.183783 ignition[789]: disks: disks passed Jan 29 16:11:29.183933 ignition[789]: Ignition finished successfully Jan 29 16:11:29.185223 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 29 16:11:29.187025 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. 
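The fetch stage tries an OpenStack config drive first (looked up by filesystem label) and falls back to the metadata service; the SHA512 line is Ignition logging a digest of the config it finally retrieved. Both lookups, approximated in shell with the labels and endpoint taken verbatim from the log:

  blkid -L config-2 || blkid -L CONFIG-2            # is a config drive attached?
  curl -sf http://169.254.169.254/openstack/latest/user_data | sha512sum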
Jan 29 16:11:29.188903 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 29 16:11:29.189846 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 29 16:11:29.191603 systemd[1]: Reached target sysinit.target - System Initialization. Jan 29 16:11:29.193147 systemd[1]: Reached target basic.target - Basic System. Jan 29 16:11:29.200366 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 29 16:11:29.223600 systemd-fsck[797]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 29 16:11:29.227390 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 29 16:11:29.235367 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 29 16:11:29.361250 kernel: EXT4-fs (vda9): mounted filesystem 9f41abed-fd12-4e57-bcd4-5c0ef7f8a1bf r/w with ordered data mode. Quota mode: none. Jan 29 16:11:29.362833 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 29 16:11:29.364318 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 29 16:11:29.372346 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 29 16:11:29.377237 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 29 16:11:29.378630 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 29 16:11:29.380986 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 29 16:11:29.383859 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 29 16:11:29.383898 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 29 16:11:29.391198 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/vda6 scanned by mount (805) Jan 29 16:11:29.395187 kernel: BTRFS info (device vda6): first mount of filesystem aa75aabd-8755-4402-b4b6-23093345fe03 Jan 29 16:11:29.400919 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 29 16:11:29.400954 kernel: BTRFS info (device vda6): using free space tree Jan 29 16:11:29.403972 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 29 16:11:29.413248 kernel: BTRFS info (device vda6): auto enabling async discard Jan 29 16:11:29.415410 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 29 16:11:29.418717 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 29 16:11:29.487961 initrd-setup-root[834]: cut: /sysroot/etc/passwd: No such file or directory Jan 29 16:11:29.496859 initrd-setup-root[841]: cut: /sysroot/etc/group: No such file or directory Jan 29 16:11:29.505570 initrd-setup-root[848]: cut: /sysroot/etc/shadow: No such file or directory Jan 29 16:11:29.512816 initrd-setup-root[855]: cut: /sysroot/etc/gshadow: No such file or directory Jan 29 16:11:29.615220 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 29 16:11:29.621348 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 29 16:11:29.625351 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 29 16:11:29.637227 kernel: BTRFS info (device vda6): last unmount of filesystem aa75aabd-8755-4402-b4b6-23093345fe03 Jan 29 16:11:29.667413 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
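The systemd-fsck summary ("ROOT: clean, 14/1628000 files, 120691/1617920 blocks") is ordinary e2fsck output for the ext4 root that gets mounted at /sysroot immediately afterwards. The same check can be repeated read-only by hand:

  e2fsck -n /dev/disk/by-label/ROOT   # -n answers "no" to all prompts and opens the device read-only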
Jan 29 16:11:29.671611 ignition[922]: INFO : Ignition 2.19.0 Jan 29 16:11:29.671611 ignition[922]: INFO : Stage: mount Jan 29 16:11:29.673317 ignition[922]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 29 16:11:29.673317 ignition[922]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 29 16:11:29.675146 ignition[922]: INFO : mount: mount passed Jan 29 16:11:29.675146 ignition[922]: INFO : Ignition finished successfully Jan 29 16:11:29.676130 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 29 16:11:29.789766 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 29 16:11:30.123954 systemd-networkd[767]: eth0: Gained IPv6LL Jan 29 16:11:31.630423 systemd-networkd[767]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8964:24:19ff:fee6:2592/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8964:24:19ff:fee6:2592/64 assigned by NDisc. Jan 29 16:11:31.630440 systemd-networkd[767]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Jan 29 16:11:36.555296 coreos-metadata[807]: Jan 29 16:11:36.555 WARN failed to locate config-drive, using the metadata service API instead Jan 29 16:11:36.580651 coreos-metadata[807]: Jan 29 16:11:36.580 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 29 16:11:36.592947 coreos-metadata[807]: Jan 29 16:11:36.592 INFO Fetch successful Jan 29 16:11:36.593844 coreos-metadata[807]: Jan 29 16:11:36.593 INFO wrote hostname srv-6bdnt.gb1.brightbox.com to /sysroot/etc/hostname Jan 29 16:11:36.597402 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 29 16:11:36.597563 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 29 16:11:36.610756 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 29 16:11:36.619149 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 29 16:11:36.646212 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by mount (939) Jan 29 16:11:36.651868 kernel: BTRFS info (device vda6): first mount of filesystem aa75aabd-8755-4402-b4b6-23093345fe03 Jan 29 16:11:36.651926 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 29 16:11:36.654825 kernel: BTRFS info (device vda6): using free space tree Jan 29 16:11:36.659230 kernel: BTRFS info (device vda6): auto enabling async discard Jan 29 16:11:36.662295 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
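flatcar-openstack-hostname.service waited for a config drive, gave up ("failed to locate config-drive, using the metadata service API instead"), then pulled the hostname from the EC2-style metadata endpoint and wrote it into the target root. An approximation of that fetch, using the URL from the log:

  h=$(curl -sf http://169.254.169.254/latest/meta-data/hostname)
  echo "$h" > /sysroot/etc/hostname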
Jan 29 16:11:36.694262 ignition[957]: INFO : Ignition 2.19.0 Jan 29 16:11:36.695453 ignition[957]: INFO : Stage: files Jan 29 16:11:36.696107 ignition[957]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 29 16:11:36.696107 ignition[957]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 29 16:11:36.698313 ignition[957]: DEBUG : files: compiled without relabeling support, skipping Jan 29 16:11:36.704681 ignition[957]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 29 16:11:36.704681 ignition[957]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 29 16:11:36.711042 ignition[957]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 29 16:11:36.712113 ignition[957]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 29 16:11:36.712113 ignition[957]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 29 16:11:36.711935 unknown[957]: wrote ssh authorized keys file for user: core Jan 29 16:11:36.715197 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 29 16:11:36.715197 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Jan 29 16:11:36.954837 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 29 16:11:38.048457 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 29 16:11:38.048457 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 29 16:11:38.052180 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1 Jan 29 16:11:38.665464 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 29 16:11:40.438877 ignition[957]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" Jan 29 16:11:40.438877 ignition[957]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 29 16:11:40.444160 ignition[957]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 29 16:11:40.444160 ignition[957]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 29 16:11:40.444160 ignition[957]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 29 16:11:40.444160 ignition[957]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 29 16:11:40.444160 ignition[957]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 29 16:11:40.444160 ignition[957]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 29 16:11:40.444160 ignition[957]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 29 16:11:40.444160 ignition[957]: INFO : files: files passed Jan 29 16:11:40.444160 ignition[957]: INFO : Ignition finished successfully Jan 29 16:11:40.443309 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 29 16:11:40.454441 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 29 16:11:40.459628 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 29 16:11:40.462209 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 29 16:11:40.462364 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 29 16:11:40.477423 initrd-setup-root-after-ignition[986]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 29 16:11:40.477423 initrd-setup-root-after-ignition[986]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 29 16:11:40.481156 initrd-setup-root-after-ignition[990]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 29 16:11:40.483900 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 29 16:11:40.485039 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 29 16:11:40.492373 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 29 16:11:40.525960 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 29 16:11:40.526075 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 29 16:11:40.527557 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Jan 29 16:11:40.528688 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 29 16:11:40.530518 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 29 16:11:40.536361 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 29 16:11:40.555420 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 29 16:11:40.563422 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 29 16:11:40.577271 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 29 16:11:40.579300 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 29 16:11:40.580284 systemd[1]: Stopped target timers.target - Timer Units. Jan 29 16:11:40.582078 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 29 16:11:40.582290 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 29 16:11:40.584322 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 29 16:11:40.585303 systemd[1]: Stopped target basic.target - Basic System. Jan 29 16:11:40.586827 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 29 16:11:40.588342 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 29 16:11:40.589804 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 29 16:11:40.591509 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 29 16:11:40.593089 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 29 16:11:40.594787 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 29 16:11:40.596414 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 29 16:11:40.597988 systemd[1]: Stopped target swap.target - Swaps. Jan 29 16:11:40.599357 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 29 16:11:40.599536 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 29 16:11:40.601363 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 29 16:11:40.602336 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 29 16:11:40.603742 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 29 16:11:40.606314 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 29 16:11:40.607391 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 29 16:11:40.607615 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 29 16:11:40.609603 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 29 16:11:40.609778 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 29 16:11:40.611546 systemd[1]: ignition-files.service: Deactivated successfully. Jan 29 16:11:40.611700 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 29 16:11:40.619548 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 29 16:11:40.620376 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 29 16:11:40.620629 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 29 16:11:40.625436 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 29 16:11:40.628237 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Jan 29 16:11:40.629337 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 29 16:11:40.633473 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 29 16:11:40.634464 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 29 16:11:40.640418 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 29 16:11:40.648295 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 29 16:11:40.652457 ignition[1010]: INFO : Ignition 2.19.0 Jan 29 16:11:40.652457 ignition[1010]: INFO : Stage: umount Jan 29 16:11:40.652457 ignition[1010]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 29 16:11:40.652457 ignition[1010]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 29 16:11:40.652457 ignition[1010]: INFO : umount: umount passed Jan 29 16:11:40.652457 ignition[1010]: INFO : Ignition finished successfully Jan 29 16:11:40.650891 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 29 16:11:40.651040 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 29 16:11:40.656685 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 29 16:11:40.656783 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 29 16:11:40.659326 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 29 16:11:40.659394 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 29 16:11:40.660897 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 29 16:11:40.660963 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 29 16:11:40.662401 systemd[1]: Stopped target network.target - Network. Jan 29 16:11:40.663122 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 29 16:11:40.663214 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 29 16:11:40.663980 systemd[1]: Stopped target paths.target - Path Units. Jan 29 16:11:40.667258 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 29 16:11:40.672288 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 29 16:11:40.673077 systemd[1]: Stopped target slices.target - Slice Units. Jan 29 16:11:40.673814 systemd[1]: Stopped target sockets.target - Socket Units. Jan 29 16:11:40.675516 systemd[1]: iscsid.socket: Deactivated successfully. Jan 29 16:11:40.675590 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 29 16:11:40.676854 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 29 16:11:40.676917 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 29 16:11:40.678426 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 29 16:11:40.678505 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 29 16:11:40.680063 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 29 16:11:40.680128 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 29 16:11:40.682385 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 29 16:11:40.683480 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 29 16:11:40.686153 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 29 16:11:40.686317 systemd-networkd[767]: eth0: DHCPv6 lease lost Jan 29 16:11:40.688136 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 29 16:11:40.688983 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. 
Jan 29 16:11:40.690504 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 29 16:11:40.690644 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 29 16:11:40.693136 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 29 16:11:40.693630 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 29 16:11:40.696692 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 29 16:11:40.696759 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 29 16:11:40.704386 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 29 16:11:40.705595 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 29 16:11:40.706683 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 29 16:11:40.709333 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 29 16:11:40.710660 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 29 16:11:40.710806 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 29 16:11:40.716665 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 29 16:11:40.716765 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 29 16:11:40.721718 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 29 16:11:40.721784 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 29 16:11:40.723385 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 29 16:11:40.723452 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 29 16:11:40.731686 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 29 16:11:40.732742 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 29 16:11:40.734592 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 29 16:11:40.734757 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 29 16:11:40.737033 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 29 16:11:40.737146 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 29 16:11:40.738323 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 29 16:11:40.738381 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 29 16:11:40.740022 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 29 16:11:40.740106 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 29 16:11:40.742424 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 29 16:11:40.742494 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 29 16:11:40.744077 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 29 16:11:40.744163 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 29 16:11:40.751377 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 29 16:11:40.752389 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 29 16:11:40.752459 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 29 16:11:40.756387 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 29 16:11:40.756458 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 29 16:11:40.762097 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 29 16:11:40.762274 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 29 16:11:40.764600 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 29 16:11:40.771377 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 29 16:11:40.782706 systemd[1]: Switching root. Jan 29 16:11:40.813695 systemd-journald[201]: Journal stopped Jan 29 16:11:42.220782 systemd-journald[201]: Received SIGTERM from PID 1 (systemd). Jan 29 16:11:42.220886 kernel: SELinux: policy capability network_peer_controls=1 Jan 29 16:11:42.220924 kernel: SELinux: policy capability open_perms=1 Jan 29 16:11:42.220945 kernel: SELinux: policy capability extended_socket_class=1 Jan 29 16:11:42.220963 kernel: SELinux: policy capability always_check_network=0 Jan 29 16:11:42.220980 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 29 16:11:42.221004 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 29 16:11:42.221038 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 29 16:11:42.221059 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 29 16:11:42.221082 kernel: audit: type=1403 audit(1738167101.030:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 29 16:11:42.221107 systemd[1]: Successfully loaded SELinux policy in 47.546ms. Jan 29 16:11:42.221147 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 21.713ms. Jan 29 16:11:42.221167 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 29 16:11:42.223233 systemd[1]: Detected virtualization kvm. Jan 29 16:11:42.223263 systemd[1]: Detected architecture x86-64. Jan 29 16:11:42.223284 systemd[1]: Detected first boot. Jan 29 16:11:42.223323 systemd[1]: Hostname set to <srv-6bdnt.gb1.brightbox.com>. Jan 29 16:11:42.223345 systemd[1]: Initializing machine ID from VM UUID. Jan 29 16:11:42.223366 zram_generator::config[1052]: No configuration found. Jan 29 16:11:42.223387 systemd[1]: Populated /etc with preset unit settings. Jan 29 16:11:42.223407 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 29 16:11:42.223435 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 29 16:11:42.223458 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 29 16:11:42.223480 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 29 16:11:42.223515 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 29 16:11:42.223545 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 29 16:11:42.223565 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 29 16:11:42.223592 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 29 16:11:42.223613 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 29 16:11:42.223640 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 29 16:11:42.223670 systemd[1]: Created slice user.slice - User and Session Slice. 
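Across the switch-root, the initrd journal is handed over: journald receives SIGTERM from the new PID 1 and is scheduled for restart a few entries later. The microsecond timestamps used throughout this capture match journalctl's short-precise output format, so a log in this shape can plausibly be re-extracted from a booted machine with:

  journalctl -b -o short-precise --no-pager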
Jan 29 16:11:42.223690 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 29 16:11:42.223710 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 29 16:11:42.223765 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 29 16:11:42.223786 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 29 16:11:42.224943 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 29 16:11:42.224975 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 29 16:11:42.224994 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 29 16:11:42.225012 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 29 16:11:42.225040 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 29 16:11:42.225073 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 29 16:11:42.225103 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 29 16:11:42.225122 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 29 16:11:42.225141 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 29 16:11:42.225160 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 29 16:11:42.225222 systemd[1]: Reached target slices.target - Slice Units. Jan 29 16:11:42.225245 systemd[1]: Reached target swap.target - Swaps. Jan 29 16:11:42.225266 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 29 16:11:42.225300 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 29 16:11:42.225322 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 29 16:11:42.225343 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 29 16:11:42.225363 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 29 16:11:42.225396 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 29 16:11:42.225431 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 29 16:11:42.225480 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 29 16:11:42.225501 systemd[1]: Mounting media.mount - External Media Directory... Jan 29 16:11:42.225535 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 29 16:11:42.225556 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 29 16:11:42.225577 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 29 16:11:42.225612 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 29 16:11:42.225636 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 29 16:11:42.225657 systemd[1]: Reached target machines.target - Containers. Jan 29 16:11:42.225701 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 29 16:11:42.225724 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. 
Jan 29 16:11:42.225745 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 29 16:11:42.225777 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 29 16:11:42.225797 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 29 16:11:42.225816 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 29 16:11:42.225854 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 29 16:11:42.225873 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 29 16:11:42.225891 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 29 16:11:42.225933 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 29 16:11:42.225959 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 29 16:11:42.225978 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 29 16:11:42.225997 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 29 16:11:42.226018 systemd[1]: Stopped systemd-fsck-usr.service. Jan 29 16:11:42.226036 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 29 16:11:42.226055 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 29 16:11:42.226074 kernel: loop: module loaded Jan 29 16:11:42.226094 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 29 16:11:42.226138 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 29 16:11:42.226159 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 29 16:11:42.227260 kernel: ACPI: bus type drm_connector registered Jan 29 16:11:42.227298 kernel: fuse: init (API version 7.39) Jan 29 16:11:42.227321 systemd[1]: verity-setup.service: Deactivated successfully. Jan 29 16:11:42.227342 systemd[1]: Stopped verity-setup.service. Jan 29 16:11:42.227362 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 29 16:11:42.227383 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 29 16:11:42.227420 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 29 16:11:42.227442 systemd[1]: Mounted media.mount - External Media Directory. Jan 29 16:11:42.227506 systemd-journald[1145]: Collecting audit messages is disabled. Jan 29 16:11:42.227570 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 29 16:11:42.227606 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 29 16:11:42.227628 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 29 16:11:42.227649 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 29 16:11:42.227678 systemd-journald[1145]: Journal started Jan 29 16:11:42.227716 systemd-journald[1145]: Runtime Journal (/run/log/journal/9a09d7ac1c464f098fb3cdcb93f0163c) is 4.7M, max 38.0M, 33.2M free. Jan 29 16:11:41.823843 systemd[1]: Queued start job for default target multi-user.target. Jan 29 16:11:41.841781 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 29 16:11:41.842575 systemd[1]: systemd-journald.service: Deactivated successfully. 
Jan 29 16:11:42.230254 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 29 16:11:42.232254 systemd[1]: Started systemd-journald.service - Journal Service. Jan 29 16:11:42.234401 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 29 16:11:42.234690 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 29 16:11:42.236076 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 29 16:11:42.236390 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 29 16:11:42.237598 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 29 16:11:42.237841 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 29 16:11:42.239209 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 29 16:11:42.239410 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 29 16:11:42.240703 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 29 16:11:42.240961 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 29 16:11:42.242131 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 29 16:11:42.242503 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 29 16:11:42.243627 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 29 16:11:42.244893 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 29 16:11:42.246027 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 29 16:11:42.262041 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 29 16:11:42.273272 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 29 16:11:42.282372 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 29 16:11:42.284301 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 29 16:11:42.284352 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 29 16:11:42.287630 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Jan 29 16:11:42.294378 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 29 16:11:42.297634 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 29 16:11:42.299383 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 29 16:11:42.305363 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 29 16:11:42.314056 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 29 16:11:42.314955 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 29 16:11:42.318342 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 29 16:11:42.319159 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 29 16:11:42.322980 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 29 16:11:42.330396 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... 
Jan 29 16:11:42.338791 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 29 16:11:42.344514 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 29 16:11:42.345505 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 29 16:11:42.347698 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 29 16:11:42.366070 systemd-journald[1145]: Time spent on flushing to /var/log/journal/9a09d7ac1c464f098fb3cdcb93f0163c is 42.733ms for 1138 entries. Jan 29 16:11:42.366070 systemd-journald[1145]: System Journal (/var/log/journal/9a09d7ac1c464f098fb3cdcb93f0163c) is 8.0M, max 584.8M, 576.8M free. Jan 29 16:11:42.437113 systemd-journald[1145]: Received client request to flush runtime journal. Jan 29 16:11:42.437212 kernel: loop0: detected capacity change from 0 to 142488 Jan 29 16:11:42.437243 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 29 16:11:42.396616 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 29 16:11:42.397669 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 29 16:11:42.409421 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Jan 29 16:11:42.450682 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 29 16:11:42.476101 kernel: loop1: detected capacity change from 0 to 205544 Jan 29 16:11:42.474316 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 29 16:11:42.475214 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Jan 29 16:11:42.482526 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 29 16:11:42.514649 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 29 16:11:42.522072 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 29 16:11:42.554197 kernel: loop2: detected capacity change from 0 to 140768 Jan 29 16:11:42.573773 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 29 16:11:42.585486 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Jan 29 16:11:42.606032 systemd-tmpfiles[1203]: ACLs are not supported, ignoring. Jan 29 16:11:42.606059 systemd-tmpfiles[1203]: ACLs are not supported, ignoring. Jan 29 16:11:42.623213 kernel: loop3: detected capacity change from 0 to 8 Jan 29 16:11:42.637990 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 29 16:11:42.641956 udevadm[1206]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Jan 29 16:11:42.661698 kernel: loop4: detected capacity change from 0 to 142488 Jan 29 16:11:42.693216 kernel: loop5: detected capacity change from 0 to 205544 Jan 29 16:11:42.719204 kernel: loop6: detected capacity change from 0 to 140768 Jan 29 16:11:42.746211 kernel: loop7: detected capacity change from 0 to 8 Jan 29 16:11:42.758128 (sd-merge)[1210]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Jan 29 16:11:42.758971 (sd-merge)[1210]: Merged extensions into '/usr'. Jan 29 16:11:42.766853 systemd[1]: Reloading requested from client PID 1185 ('systemd-sysext') (unit systemd-sysext.service)... Jan 29 16:11:42.766875 systemd[1]: Reloading... 
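The (sd-merge) entries show systemd-sysext overlaying the four extension images onto /usr, followed by a service-manager reload to pick up the units they ship. On a running system the result can be inspected with:

  systemd-sysext status   # which hierarchies are merged, and from which images
  systemd-dissect /opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw   # peek inside one image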
Jan 29 16:11:42.877218 zram_generator::config[1235]: No configuration found. Jan 29 16:11:43.092565 ldconfig[1180]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 29 16:11:43.182032 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 29 16:11:43.252667 systemd[1]: Reloading finished in 485 ms. Jan 29 16:11:43.283694 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 29 16:11:43.286920 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 29 16:11:43.299413 systemd[1]: Starting ensure-sysext.service... Jan 29 16:11:43.305667 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 29 16:11:43.315034 systemd[1]: Reloading requested from client PID 1292 ('systemctl') (unit ensure-sysext.service)... Jan 29 16:11:43.315240 systemd[1]: Reloading... Jan 29 16:11:43.403243 zram_generator::config[1319]: No configuration found. Jan 29 16:11:43.405259 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 29 16:11:43.405863 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 29 16:11:43.407383 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 29 16:11:43.407792 systemd-tmpfiles[1293]: ACLs are not supported, ignoring. Jan 29 16:11:43.407910 systemd-tmpfiles[1293]: ACLs are not supported, ignoring. Jan 29 16:11:43.416942 systemd-tmpfiles[1293]: Detected autofs mount point /boot during canonicalization of boot. Jan 29 16:11:43.416960 systemd-tmpfiles[1293]: Skipping /boot Jan 29 16:11:43.446071 systemd-tmpfiles[1293]: Detected autofs mount point /boot during canonicalization of boot. Jan 29 16:11:43.446090 systemd-tmpfiles[1293]: Skipping /boot Jan 29 16:11:43.610492 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 29 16:11:43.678964 systemd[1]: Reloading finished in 363 ms. Jan 29 16:11:43.701099 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 29 16:11:43.708734 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 29 16:11:43.729568 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Jan 29 16:11:43.735467 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 29 16:11:43.746286 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 29 16:11:43.755486 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 29 16:11:43.760259 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 29 16:11:43.770336 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 29 16:11:43.774694 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 29 16:11:43.774965 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. 
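The systemd-tmpfiles "Duplicate line for path ..., ignoring" messages are harmless; the fragments declaring the duplicates can be tracked down by dumping the merged configuration, since --cat-config prefixes each fragment with a comment naming its source file:

  systemd-tmpfiles --cat-config | grep -n -e '^#' -e '/var/log/journal'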
Jan 29 16:11:43.780643 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 29 16:11:43.786022 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 29 16:11:43.790471 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 29 16:11:43.791390 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 29 16:11:43.799510 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 29 16:11:43.800262 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 29 16:11:43.804605 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 29 16:11:43.804890 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 29 16:11:43.805109 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 29 16:11:43.806316 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 29 16:11:43.816100 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 29 16:11:43.816525 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 29 16:11:43.829543 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 29 16:11:43.831357 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 29 16:11:43.831629 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 29 16:11:43.835410 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 29 16:11:43.837064 systemd[1]: Finished ensure-sysext.service. Jan 29 16:11:43.838984 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 29 16:11:43.840351 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 29 16:11:43.848113 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 29 16:11:43.865199 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 29 16:11:43.875361 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 29 16:11:43.887198 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 29 16:11:43.888257 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 29 16:11:43.888282 systemd-udevd[1389]: Using default interface naming scheme 'v255'. Jan 29 16:11:43.891192 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 29 16:11:43.894805 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 29 16:11:43.895051 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 29 16:11:43.901260 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 29 16:11:43.903382 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Jan 29 16:11:43.904601 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 29 16:11:43.907275 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 29 16:11:43.907361 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 29 16:11:43.928770 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 29 16:11:43.932341 augenrules[1414]: No rules Jan 29 16:11:43.934575 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Jan 29 16:11:43.942352 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 29 16:11:43.956237 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 29 16:11:43.983820 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 29 16:11:44.091345 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 29 16:11:44.128945 systemd-resolved[1387]: Positive Trust Anchors: Jan 29 16:11:44.128973 systemd-resolved[1387]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 29 16:11:44.129018 systemd-resolved[1387]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 29 16:11:44.152840 systemd-resolved[1387]: Using system hostname 'srv-6bdnt.gb1.brightbox.com'. Jan 29 16:11:44.164516 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 29 16:11:44.171844 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 29 16:11:44.177997 systemd-networkd[1427]: lo: Link UP Jan 29 16:11:44.178022 systemd-networkd[1427]: lo: Gained carrier Jan 29 16:11:44.184130 systemd-networkd[1427]: Enumeration completed Jan 29 16:11:44.184332 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 29 16:11:44.184942 systemd-networkd[1427]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 16:11:44.184948 systemd-networkd[1427]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 29 16:11:44.185331 systemd[1]: Reached target network.target - Network. Jan 29 16:11:44.188296 systemd-networkd[1427]: eth0: Link UP Jan 29 16:11:44.188309 systemd-networkd[1427]: eth0: Gained carrier Jan 29 16:11:44.188327 systemd-networkd[1427]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 16:11:44.195442 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 29 16:11:44.198966 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 29 16:11:44.200125 systemd[1]: Reached target time-set.target - System Time Set. 
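The 'Positive Trust Anchors' line shows systemd-resolved seeding DNSSEC validation with the root zone's KSK-2017 DS record (key tag 20326), while the negative anchors exempt private and special-use zones from validation. Illustrative commands for inspecting the resulting resolver state, assuming resolvectl is available as it is on systemd releases of this vintage:

    resolvectl status            # per-link DNS servers, search domains, DNSSEC setting
    resolvectl query example.com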
Jan 29 16:11:44.215326 systemd-networkd[1427]: eth0: DHCPv4 address 10.230.37.146/30, gateway 10.230.37.145 acquired from 10.230.37.145 Jan 29 16:11:44.217337 systemd-timesyncd[1406]: Network configuration changed, trying to establish connection. Jan 29 16:11:44.248215 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1431) Jan 29 16:11:44.271527 systemd-networkd[1427]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 29 16:11:44.310212 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Jan 29 16:11:44.310295 kernel: mousedev: PS/2 mouse device common for all mice Jan 29 16:11:44.315193 kernel: ACPI: button: Power Button [PWRF] Jan 29 16:11:44.366964 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 29 16:11:44.371206 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Jan 29 16:11:44.378389 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 29 16:11:44.393460 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 29 16:11:44.401665 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Jan 29 16:11:44.401953 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 29 16:11:44.397588 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 29 16:11:44.439890 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 29 16:11:44.617888 systemd-timesyncd[1406]: Contacted time server 80.1.72.242:123 (0.flatcar.pool.ntp.org). Jan 29 16:11:44.619868 systemd-timesyncd[1406]: Initial clock synchronization to Wed 2025-01-29 16:11:44.826517 UTC. Jan 29 16:11:44.648236 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Jan 29 16:11:44.723247 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Jan 29 16:11:44.724877 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 29 16:11:44.742245 lvm[1464]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 29 16:11:44.781556 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Jan 29 16:11:44.783500 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 29 16:11:44.784407 systemd[1]: Reached target sysinit.target - System Initialization. Jan 29 16:11:44.785356 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 29 16:11:44.786428 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 29 16:11:44.787771 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 29 16:11:44.788750 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 29 16:11:44.789580 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 29 16:11:44.790398 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 29 16:11:44.790472 systemd[1]: Reached target paths.target - Path Units. Jan 29 16:11:44.791153 systemd[1]: Reached target timers.target - Timer Units. 
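The DHCPv4 lease above is a /30: four addresses, two of them usable, here 10.230.37.146 for the host and 10.230.37.145 for the gateway, which also served the lease. A sketch of how to inspect the lease and the NTP peer that systemd-timesyncd contacted:

    networkctl status eth0         # address, gateway, and lease details as networkd sees them
    timedatectl timesync-status    # the NTP server (0.flatcar.pool.ntp.org above) and sync state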
Jan 29 16:11:44.793454 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 29 16:11:44.796999 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 29 16:11:44.804195 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 29 16:11:44.807049 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Jan 29 16:11:44.808794 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 29 16:11:44.809730 systemd[1]: Reached target sockets.target - Socket Units. Jan 29 16:11:44.810424 systemd[1]: Reached target basic.target - Basic System. Jan 29 16:11:44.811210 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 29 16:11:44.811266 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 29 16:11:44.827226 systemd[1]: Starting containerd.service - containerd container runtime... Jan 29 16:11:44.830396 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 29 16:11:44.832587 lvm[1469]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 29 16:11:44.840394 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 29 16:11:44.845324 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 29 16:11:44.852241 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 29 16:11:44.854200 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 29 16:11:44.856401 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 29 16:11:44.864308 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 29 16:11:44.868421 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 29 16:11:44.873946 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 29 16:11:44.887373 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 29 16:11:44.891839 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 29 16:11:44.894828 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 29 16:11:44.898526 systemd[1]: Starting update-engine.service - Update Engine... Jan 29 16:11:44.902839 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 29 16:11:44.940109 jq[1473]: false Jan 29 16:11:44.941708 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Jan 29 16:11:44.957354 dbus-daemon[1472]: [system] SELinux support is enabled Jan 29 16:11:44.962469 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 29 16:11:44.971436 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
Jan 29 16:11:44.991421 extend-filesystems[1474]: Found loop4 Jan 29 16:11:44.991421 extend-filesystems[1474]: Found loop5 Jan 29 16:11:44.991421 extend-filesystems[1474]: Found loop6 Jan 29 16:11:44.991421 extend-filesystems[1474]: Found loop7 Jan 29 16:11:44.991421 extend-filesystems[1474]: Found vda Jan 29 16:11:44.991421 extend-filesystems[1474]: Found vda1 Jan 29 16:11:44.991421 extend-filesystems[1474]: Found vda2 Jan 29 16:11:44.991421 extend-filesystems[1474]: Found vda3 Jan 29 16:11:44.991421 extend-filesystems[1474]: Found usr Jan 29 16:11:44.991421 extend-filesystems[1474]: Found vda4 Jan 29 16:11:44.991421 extend-filesystems[1474]: Found vda6 Jan 29 16:11:44.991421 extend-filesystems[1474]: Found vda7 Jan 29 16:11:44.991421 extend-filesystems[1474]: Found vda9 Jan 29 16:11:44.991421 extend-filesystems[1474]: Checking size of /dev/vda9 Jan 29 16:11:44.971752 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 29 16:11:45.006110 dbus-daemon[1472]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1427 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Jan 29 16:11:45.116672 jq[1484]: true Jan 29 16:11:45.116924 extend-filesystems[1474]: Resized partition /dev/vda9 Jan 29 16:11:44.980757 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 29 16:11:45.118471 tar[1491]: linux-amd64/helm Jan 29 16:11:44.981074 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 29 16:11:45.000684 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 29 16:11:45.121444 jq[1500]: true Jan 29 16:11:45.000750 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 29 16:11:45.011086 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 29 16:11:45.011126 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 29 16:11:45.039768 (ntainerd)[1493]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 29 16:11:45.051447 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Jan 29 16:11:45.131216 extend-filesystems[1511]: resize2fs 1.47.1 (20-May-2024) Jan 29 16:11:45.156071 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (1436) Jan 29 16:11:45.148875 systemd-logind[1482]: Watching system buttons on /dev/input/event2 (Power Button) Jan 29 16:11:45.156676 update_engine[1483]: I20250129 16:11:45.136022 1483 main.cc:92] Flatcar Update Engine starting Jan 29 16:11:45.148922 systemd-logind[1482]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 29 16:11:45.149472 systemd-logind[1482]: New seat seat0. Jan 29 16:11:45.155330 systemd[1]: Started systemd-logind.service - User Login Management. Jan 29 16:11:45.169369 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Jan 29 16:11:45.173747 systemd[1]: motdgen.service: Deactivated successfully. 
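extend-filesystems has found /dev/vda9 smaller than its partition, and the kernel line above shows the ext4 grow starting: from 1617920 to 15121403 blocks of 4 KiB, i.e. roughly 6.2 GiB to 57.7 GiB. A minimal sketch of the equivalent manual operation (ext4 supports growing online, while mounted):

    resize2fs /dev/vda9    # grow the filesystem to fill its partition
    df -h /                # confirm the new size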
Jan 29 16:11:45.176730 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 29 16:11:45.185639 systemd[1]: Started update-engine.service - Update Engine. Jan 29 16:11:45.195982 update_engine[1483]: I20250129 16:11:45.188776 1483 update_check_scheduler.cc:74] Next update check in 3m30s Jan 29 16:11:45.209592 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 29 16:11:45.310243 bash[1529]: Updated "/home/core/.ssh/authorized_keys" Jan 29 16:11:45.311286 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 29 16:11:45.329496 systemd[1]: Starting sshkeys.service... Jan 29 16:11:45.399466 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 29 16:11:45.410670 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 29 16:11:45.506236 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Jan 29 16:11:45.531023 extend-filesystems[1511]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 29 16:11:45.531023 extend-filesystems[1511]: old_desc_blocks = 1, new_desc_blocks = 8 Jan 29 16:11:45.531023 extend-filesystems[1511]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Jan 29 16:11:45.545837 extend-filesystems[1474]: Resized filesystem in /dev/vda9 Jan 29 16:11:45.533286 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 29 16:11:45.533577 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 29 16:11:45.552626 dbus-daemon[1472]: [system] Successfully activated service 'org.freedesktop.hostname1' Jan 29 16:11:45.552831 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Jan 29 16:11:45.555586 dbus-daemon[1472]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1504 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Jan 29 16:11:45.570053 systemd[1]: Starting polkit.service - Authorization Manager... Jan 29 16:11:45.606727 polkitd[1546]: Started polkitd version 121 Jan 29 16:11:45.620806 polkitd[1546]: Loading rules from directory /etc/polkit-1/rules.d Jan 29 16:11:45.620929 polkitd[1546]: Loading rules from directory /usr/share/polkit-1/rules.d Jan 29 16:11:45.625429 polkitd[1546]: Finished loading, compiling and executing 2 rules Jan 29 16:11:45.630265 locksmithd[1515]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 29 16:11:45.632062 dbus-daemon[1472]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Jan 29 16:11:45.632360 systemd[1]: Started polkit.service - Authorization Manager. Jan 29 16:11:45.633258 polkitd[1546]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Jan 29 16:11:45.668973 containerd[1493]: time="2025-01-29T16:11:45.668799117Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Jan 29 16:11:45.678905 systemd-hostnamed[1504]: Hostname set to (static) Jan 29 16:11:45.715154 containerd[1493]: time="2025-01-29T16:11:45.715082394Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Jan 29 16:11:45.718644 containerd[1493]: time="2025-01-29T16:11:45.718602305Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." 
error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.74-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Jan 29 16:11:45.718838 containerd[1493]: time="2025-01-29T16:11:45.718801218Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Jan 29 16:11:45.718948 containerd[1493]: time="2025-01-29T16:11:45.718924258Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Jan 29 16:11:45.719566 containerd[1493]: time="2025-01-29T16:11:45.719537217Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Jan 29 16:11:45.719700 containerd[1493]: time="2025-01-29T16:11:45.719673800Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Jan 29 16:11:45.721227 containerd[1493]: time="2025-01-29T16:11:45.719906327Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Jan 29 16:11:45.721227 containerd[1493]: time="2025-01-29T16:11:45.719934991Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Jan 29 16:11:45.721227 containerd[1493]: time="2025-01-29T16:11:45.720179063Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 29 16:11:45.721227 containerd[1493]: time="2025-01-29T16:11:45.720206638Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Jan 29 16:11:45.721227 containerd[1493]: time="2025-01-29T16:11:45.720273067Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Jan 29 16:11:45.721227 containerd[1493]: time="2025-01-29T16:11:45.720295396Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Jan 29 16:11:45.721227 containerd[1493]: time="2025-01-29T16:11:45.720507385Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Jan 29 16:11:45.721227 containerd[1493]: time="2025-01-29T16:11:45.720891185Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Jan 29 16:11:45.721227 containerd[1493]: time="2025-01-29T16:11:45.721021818Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 29 16:11:45.721227 containerd[1493]: time="2025-01-29T16:11:45.721054478Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Jan 29 16:11:45.721872 containerd[1493]: time="2025-01-29T16:11:45.721199834Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." 
type=io.containerd.metadata.v1 Jan 29 16:11:45.722133 containerd[1493]: time="2025-01-29T16:11:45.722107141Z" level=info msg="metadata content store policy set" policy=shared Jan 29 16:11:45.726080 containerd[1493]: time="2025-01-29T16:11:45.726045736Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.726331768Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.726368630Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.726423667Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.726466848Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.726668844Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.726959708Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.727132116Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.727158215Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.727178169Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.727228799Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.727282733Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.727311893Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.727344987Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Jan 29 16:11:45.728215 containerd[1493]: time="2025-01-29T16:11:45.727391620Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727416466Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727435714Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727463486Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." 
type=io.containerd.service.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727503469Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727536612Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727573700Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727607035Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727641627Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727661737Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727680281Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727700430Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727723879Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727744884Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.728791 containerd[1493]: time="2025-01-29T16:11:45.727762333Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.727804154Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.727826600Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.727855829Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.727893865Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.727915845Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.727933816Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.728003168Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.728036919Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." 
error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.728056404Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.728075047Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.728090499Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.728107984Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.728135608Z" level=info msg="NRI interface is disabled by configuration." Jan 29 16:11:45.729313 containerd[1493]: time="2025-01-29T16:11:45.728165448Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Jan 29 16:11:45.731232 containerd[1493]: time="2025-01-29T16:11:45.730590167Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false 
IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Jan 29 16:11:45.731232 containerd[1493]: time="2025-01-29T16:11:45.730705325Z" level=info msg="Connect containerd service" Jan 29 16:11:45.731232 containerd[1493]: time="2025-01-29T16:11:45.730781886Z" level=info msg="using legacy CRI server" Jan 29 16:11:45.731232 containerd[1493]: time="2025-01-29T16:11:45.730799548Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 29 16:11:45.731232 containerd[1493]: time="2025-01-29T16:11:45.730971752Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Jan 29 16:11:45.732420 containerd[1493]: time="2025-01-29T16:11:45.732387649Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 29 16:11:45.732858 containerd[1493]: time="2025-01-29T16:11:45.732783474Z" level=info msg="Start subscribing containerd event" Jan 29 16:11:45.732916 containerd[1493]: time="2025-01-29T16:11:45.732878199Z" level=info msg="Start recovering state" Jan 29 16:11:45.733642 containerd[1493]: time="2025-01-29T16:11:45.733004330Z" level=info msg="Start event monitor" Jan 29 16:11:45.733642 containerd[1493]: time="2025-01-29T16:11:45.733037797Z" level=info msg="Start snapshots syncer" Jan 29 16:11:45.733642 containerd[1493]: time="2025-01-29T16:11:45.733054283Z" level=info msg="Start cni network conf syncer for default" Jan 29 16:11:45.733642 containerd[1493]: time="2025-01-29T16:11:45.733067733Z" level=info msg="Start streaming server" Jan 29 16:11:45.733642 containerd[1493]: time="2025-01-29T16:11:45.733548631Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 29 16:11:45.734047 containerd[1493]: time="2025-01-29T16:11:45.734020541Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 29 16:11:45.734504 systemd[1]: Started containerd.service - containerd container runtime. Jan 29 16:11:45.744800 containerd[1493]: time="2025-01-29T16:11:45.744770801Z" level=info msg="containerd successfully booted in 0.079028s" Jan 29 16:11:45.807455 sshd_keygen[1490]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 29 16:11:45.839267 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 29 16:11:45.851428 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 29 16:11:45.861512 systemd[1]: issuegen.service: Deactivated successfully. Jan 29 16:11:45.861851 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 29 16:11:45.871515 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 29 16:11:45.893705 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 29 16:11:45.903701 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 29 16:11:45.914740 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 29 16:11:45.916182 systemd[1]: Reached target getty.target - Login Prompts. Jan 29 16:11:46.059435 systemd-networkd[1427]: eth0: Gained IPv6LL Jan 29 16:11:46.063348 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. 
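The CRI config dump above shows the runc runtime running with SystemdCgroup:true, and the 'failed to load cni during init' error is expected at this stage: no CNI plugin has installed a network config into /etc/cni/net.d yet. A sketch of the containerd config.toml fragment corresponding to the logged runtime options (written to a scratch path here purely for illustration):

    cat > /tmp/containerd-runc-options.toml <<'EOF'
    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
      runtime_type = "io.containerd.runc.v2"
      [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
        SystemdCgroup = true
    EOF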
Jan 29 16:11:46.075887 systemd[1]: Reached target network-online.target - Network is Online. Jan 29 16:11:46.092432 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 16:11:46.095825 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 29 16:11:46.156133 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 29 16:11:46.160904 tar[1491]: linux-amd64/LICENSE Jan 29 16:11:46.161645 tar[1491]: linux-amd64/README.md Jan 29 16:11:46.176492 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 29 16:11:46.591883 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 29 16:11:46.606845 systemd[1]: Started sshd@0-10.230.37.146:22-139.178.68.195:49810.service - OpenSSH per-connection server daemon (139.178.68.195:49810). Jan 29 16:11:47.106382 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 16:11:47.114048 (kubelet)[1599]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 16:11:47.548930 sshd[1592]: Accepted publickey for core from 139.178.68.195 port 49810 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:11:47.559353 sshd[1592]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:11:47.590498 systemd-networkd[1427]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8964:24:19ff:fee6:2592/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8964:24:19ff:fee6:2592/64 assigned by NDisc. Jan 29 16:11:47.590512 systemd-networkd[1427]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Jan 29 16:11:47.600785 systemd-logind[1482]: New session 1 of user core. Jan 29 16:11:47.606098 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 29 16:11:47.616654 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 29 16:11:47.647833 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 29 16:11:47.659708 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 29 16:11:47.673553 (systemd)[1608]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 29 16:11:47.835415 systemd[1608]: Queued start job for default target default.target. Jan 29 16:11:47.845799 kubelet[1599]: E0129 16:11:47.843690 1599 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 16:11:47.848179 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 16:11:47.848534 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 16:11:47.848898 systemd[1608]: Created slice app.slice - User Application Slice. Jan 29 16:11:47.848935 systemd[1608]: Reached target paths.target - Paths. Jan 29 16:11:47.848958 systemd[1608]: Reached target timers.target - Timers. Jan 29 16:11:47.849683 systemd[1]: kubelet.service: Consumed 1.086s CPU time. Jan 29 16:11:47.854400 systemd[1608]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 29 16:11:47.878509 systemd[1608]: Listening on dbus.socket - D-Bus User Message Bus Socket. 
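The kubelet failure above ('/var/lib/kubelet/config.yaml: no such file or directory') is the normal state of a node that has not yet been joined to a cluster; the unit will keep restarting until something writes that file. The log does not show how this node gets bootstrapped; as one common example, a kubeadm join would generate the missing config (all placeholders are illustrative):

    kubeadm join <control-plane-endpoint>:6443 \
        --token <token> \
        --discovery-token-ca-cert-hash sha256:<hash>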
Jan 29 16:11:47.878711 systemd[1608]: Reached target sockets.target - Sockets. Jan 29 16:11:47.878751 systemd[1608]: Reached target basic.target - Basic System. Jan 29 16:11:47.878826 systemd[1608]: Reached target default.target - Main User Target. Jan 29 16:11:47.878893 systemd[1608]: Startup finished in 194ms. Jan 29 16:11:47.879228 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 29 16:11:47.891619 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 29 16:11:48.534100 systemd[1]: Started sshd@1-10.230.37.146:22-139.178.68.195:49820.service - OpenSSH per-connection server daemon (139.178.68.195:49820). Jan 29 16:11:49.429019 sshd[1622]: Accepted publickey for core from 139.178.68.195 port 49820 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:11:49.431464 sshd[1622]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:11:49.439485 systemd-logind[1482]: New session 2 of user core. Jan 29 16:11:49.446468 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 29 16:11:50.055814 sshd[1622]: pam_unix(sshd:session): session closed for user core Jan 29 16:11:50.059967 systemd[1]: sshd@1-10.230.37.146:22-139.178.68.195:49820.service: Deactivated successfully. Jan 29 16:11:50.062156 systemd[1]: session-2.scope: Deactivated successfully. Jan 29 16:11:50.064377 systemd-logind[1482]: Session 2 logged out. Waiting for processes to exit. Jan 29 16:11:50.066095 systemd-logind[1482]: Removed session 2. Jan 29 16:11:50.215644 systemd[1]: Started sshd@2-10.230.37.146:22-139.178.68.195:49824.service - OpenSSH per-connection server daemon (139.178.68.195:49824). Jan 29 16:11:50.972572 login[1573]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 29 16:11:50.975682 login[1574]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 29 16:11:50.980779 systemd-logind[1482]: New session 3 of user core. Jan 29 16:11:50.989658 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 29 16:11:50.994322 systemd-logind[1482]: New session 4 of user core. Jan 29 16:11:51.002667 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 29 16:11:51.117022 sshd[1629]: Accepted publickey for core from 139.178.68.195 port 49824 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:11:51.119113 sshd[1629]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:11:51.125942 systemd-logind[1482]: New session 5 of user core. Jan 29 16:11:51.133453 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 29 16:11:51.746657 sshd[1629]: pam_unix(sshd:session): session closed for user core Jan 29 16:11:51.751442 systemd[1]: sshd@2-10.230.37.146:22-139.178.68.195:49824.service: Deactivated successfully. Jan 29 16:11:51.754096 systemd[1]: session-5.scope: Deactivated successfully. Jan 29 16:11:51.755447 systemd-logind[1482]: Session 5 logged out. Waiting for processes to exit. Jan 29 16:11:51.757090 systemd-logind[1482]: Removed session 5. 
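Each inbound connection above gets its own transient unit (sshd@N-LOCAL:22-PEER:PORT.service) under the system-sshd.slice created earlier, so sessions can be inspected and stopped individually. Illustrative commands:

    systemctl list-units 'sshd@*'    # one unit per live connection
    ss -tnp '( sport = :22 )'        # the matching TCP sessions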
Jan 29 16:11:52.075947 coreos-metadata[1471]: Jan 29 16:11:52.075 WARN failed to locate config-drive, using the metadata service API instead Jan 29 16:11:52.107532 coreos-metadata[1471]: Jan 29 16:11:52.107 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 29 16:11:52.115947 coreos-metadata[1471]: Jan 29 16:11:52.115 INFO Fetch failed with 404: resource not found Jan 29 16:11:52.115947 coreos-metadata[1471]: Jan 29 16:11:52.115 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 29 16:11:52.116795 coreos-metadata[1471]: Jan 29 16:11:52.116 INFO Fetch successful Jan 29 16:11:52.116966 coreos-metadata[1471]: Jan 29 16:11:52.116 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 29 16:11:52.132969 coreos-metadata[1471]: Jan 29 16:11:52.132 INFO Fetch successful Jan 29 16:11:52.133265 coreos-metadata[1471]: Jan 29 16:11:52.133 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 29 16:11:52.149492 coreos-metadata[1471]: Jan 29 16:11:52.149 INFO Fetch successful Jan 29 16:11:52.149803 coreos-metadata[1471]: Jan 29 16:11:52.149 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 29 16:11:52.165252 coreos-metadata[1471]: Jan 29 16:11:52.165 INFO Fetch successful Jan 29 16:11:52.165412 coreos-metadata[1471]: Jan 29 16:11:52.165 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 29 16:11:52.185889 coreos-metadata[1471]: Jan 29 16:11:52.185 INFO Fetch successful Jan 29 16:11:52.220877 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 29 16:11:52.222519 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 29 16:11:52.545986 coreos-metadata[1537]: Jan 29 16:11:52.545 WARN failed to locate config-drive, using the metadata service API instead Jan 29 16:11:52.567693 coreos-metadata[1537]: Jan 29 16:11:52.567 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 29 16:11:52.592969 coreos-metadata[1537]: Jan 29 16:11:52.592 INFO Fetch successful Jan 29 16:11:52.593118 coreos-metadata[1537]: Jan 29 16:11:52.593 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 29 16:11:52.619590 coreos-metadata[1537]: Jan 29 16:11:52.619 INFO Fetch successful Jan 29 16:11:52.621496 unknown[1537]: wrote ssh authorized keys file for user: core Jan 29 16:11:52.640236 update-ssh-keys[1671]: Updated "/home/core/.ssh/authorized_keys" Jan 29 16:11:52.640678 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 29 16:11:52.643139 systemd[1]: Finished sshkeys.service. Jan 29 16:11:52.646568 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 29 16:11:52.647107 systemd[1]: Startup finished in 1.384s (kernel) + 15.269s (initrd) + 11.664s (userspace) = 28.317s. Jan 29 16:11:58.099162 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 29 16:11:58.116423 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 16:11:58.278213 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
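coreos-metadata above falls back from a config-drive to the OpenStack/EC2-style metadata service and fetches the hostname, instance data, addresses, and SSH keys over plain HTTP. The same endpoints can be queried by hand from the instance, using the URLs straight from the log:

    curl -s http://169.254.169.254/latest/meta-data/hostname
    curl -s http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key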
Jan 29 16:11:58.284524 (kubelet)[1683]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 16:11:58.356628 kubelet[1683]: E0129 16:11:58.355965 1683 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 16:11:58.360026 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 16:11:58.360322 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 16:12:01.975765 systemd[1]: Started sshd@3-10.230.37.146:22-139.178.68.195:44894.service - OpenSSH per-connection server daemon (139.178.68.195:44894). Jan 29 16:12:02.859327 sshd[1691]: Accepted publickey for core from 139.178.68.195 port 44894 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:12:02.861543 sshd[1691]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:12:02.868546 systemd-logind[1482]: New session 6 of user core. Jan 29 16:12:02.877419 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 29 16:12:03.480730 sshd[1691]: pam_unix(sshd:session): session closed for user core Jan 29 16:12:03.486874 systemd-logind[1482]: Session 6 logged out. Waiting for processes to exit. Jan 29 16:12:03.488816 systemd[1]: sshd@3-10.230.37.146:22-139.178.68.195:44894.service: Deactivated successfully. Jan 29 16:12:03.491629 systemd[1]: session-6.scope: Deactivated successfully. Jan 29 16:12:03.493426 systemd-logind[1482]: Removed session 6. Jan 29 16:12:03.649521 systemd[1]: Started sshd@4-10.230.37.146:22-139.178.68.195:44902.service - OpenSSH per-connection server daemon (139.178.68.195:44902). Jan 29 16:12:04.528349 sshd[1698]: Accepted publickey for core from 139.178.68.195 port 44902 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:12:04.530329 sshd[1698]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:12:04.536886 systemd-logind[1482]: New session 7 of user core. Jan 29 16:12:04.545432 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 29 16:12:05.141644 sshd[1698]: pam_unix(sshd:session): session closed for user core Jan 29 16:12:05.145334 systemd-logind[1482]: Session 7 logged out. Waiting for processes to exit. Jan 29 16:12:05.146742 systemd[1]: sshd@4-10.230.37.146:22-139.178.68.195:44902.service: Deactivated successfully. Jan 29 16:12:05.148704 systemd[1]: session-7.scope: Deactivated successfully. Jan 29 16:12:05.150675 systemd-logind[1482]: Removed session 7. Jan 29 16:12:05.304609 systemd[1]: Started sshd@5-10.230.37.146:22-139.178.68.195:49472.service - OpenSSH per-connection server daemon (139.178.68.195:49472). Jan 29 16:12:06.191038 sshd[1705]: Accepted publickey for core from 139.178.68.195 port 49472 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:12:06.193042 sshd[1705]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:12:06.200158 systemd-logind[1482]: New session 8 of user core. Jan 29 16:12:06.206406 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jan 29 16:12:06.813284 sshd[1705]: pam_unix(sshd:session): session closed for user core Jan 29 16:12:06.817997 systemd[1]: sshd@5-10.230.37.146:22-139.178.68.195:49472.service: Deactivated successfully. Jan 29 16:12:06.820409 systemd[1]: session-8.scope: Deactivated successfully. Jan 29 16:12:06.821615 systemd-logind[1482]: Session 8 logged out. Waiting for processes to exit. Jan 29 16:12:06.823079 systemd-logind[1482]: Removed session 8. Jan 29 16:12:06.979256 systemd[1]: Started sshd@6-10.230.37.146:22-139.178.68.195:49480.service - OpenSSH per-connection server daemon (139.178.68.195:49480). Jan 29 16:12:07.866344 sshd[1712]: Accepted publickey for core from 139.178.68.195 port 49480 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:12:07.868740 sshd[1712]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:12:07.877017 systemd-logind[1482]: New session 9 of user core. Jan 29 16:12:07.883395 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 29 16:12:08.371064 sudo[1715]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 29 16:12:08.371623 sudo[1715]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 29 16:12:08.374599 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 29 16:12:08.383607 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 16:12:08.388798 sudo[1715]: pam_unix(sudo:session): session closed for user root Jan 29 16:12:08.535571 sshd[1712]: pam_unix(sshd:session): session closed for user core Jan 29 16:12:08.541379 systemd[1]: sshd@6-10.230.37.146:22-139.178.68.195:49480.service: Deactivated successfully. Jan 29 16:12:08.545469 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 16:12:08.547007 systemd[1]: session-9.scope: Deactivated successfully. Jan 29 16:12:08.550661 systemd-logind[1482]: Session 9 logged out. Waiting for processes to exit. Jan 29 16:12:08.556684 (kubelet)[1725]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 16:12:08.558054 systemd-logind[1482]: Removed session 9. Jan 29 16:12:08.618771 kubelet[1725]: E0129 16:12:08.618663 1725 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 16:12:08.622154 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 16:12:08.622422 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 16:12:08.696473 systemd[1]: Started sshd@7-10.230.37.146:22-139.178.68.195:49482.service - OpenSSH per-connection server daemon (139.178.68.195:49482). Jan 29 16:12:09.574259 sshd[1736]: Accepted publickey for core from 139.178.68.195 port 49482 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:12:09.576524 sshd[1736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:12:09.584381 systemd-logind[1482]: New session 10 of user core. Jan 29 16:12:09.591425 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 29 16:12:10.049180 sudo[1740]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 29 16:12:10.049666 sudo[1740]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 29 16:12:10.055965 sudo[1740]: pam_unix(sudo:session): session closed for user root Jan 29 16:12:10.064620 sudo[1739]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Jan 29 16:12:10.065055 sudo[1739]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 29 16:12:10.083652 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Jan 29 16:12:10.088396 auditctl[1743]: No rules Jan 29 16:12:10.088956 systemd[1]: audit-rules.service: Deactivated successfully. Jan 29 16:12:10.089280 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Jan 29 16:12:10.103693 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Jan 29 16:12:10.142135 augenrules[1761]: No rules Jan 29 16:12:10.143870 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Jan 29 16:12:10.146482 sudo[1739]: pam_unix(sudo:session): session closed for user root Jan 29 16:12:10.289745 sshd[1736]: pam_unix(sshd:session): session closed for user core Jan 29 16:12:10.293549 systemd[1]: sshd@7-10.230.37.146:22-139.178.68.195:49482.service: Deactivated successfully. Jan 29 16:12:10.296319 systemd[1]: session-10.scope: Deactivated successfully. Jan 29 16:12:10.298403 systemd-logind[1482]: Session 10 logged out. Waiting for processes to exit. Jan 29 16:12:10.299933 systemd-logind[1482]: Removed session 10. Jan 29 16:12:10.447532 systemd[1]: Started sshd@8-10.230.37.146:22-139.178.68.195:49490.service - OpenSSH per-connection server daemon (139.178.68.195:49490). Jan 29 16:12:11.345366 sshd[1769]: Accepted publickey for core from 139.178.68.195 port 49490 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:12:11.347859 sshd[1769]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:12:11.356565 systemd-logind[1482]: New session 11 of user core. Jan 29 16:12:11.365370 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 29 16:12:11.823729 sudo[1772]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 29 16:12:11.824252 sudo[1772]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 29 16:12:12.298539 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 29 16:12:12.299649 (dockerd)[1787]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 29 16:12:12.737361 dockerd[1787]: time="2025-01-29T16:12:12.736932746Z" level=info msg="Starting up" Jan 29 16:12:12.874319 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1422456145-merged.mount: Deactivated successfully. Jan 29 16:12:12.906541 dockerd[1787]: time="2025-01-29T16:12:12.906296974Z" level=info msg="Loading containers: start." Jan 29 16:12:13.044790 kernel: Initializing XFRM netlink socket Jan 29 16:12:13.143776 systemd-networkd[1427]: docker0: Link UP Jan 29 16:12:13.175083 dockerd[1787]: time="2025-01-29T16:12:13.175023567Z" level=info msg="Loading containers: done." 
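The audit sequence above (sudo rm of the rules files, then systemctl restart audit-rules) leaves the kernel audit subsystem with an empty rule set, which is why both auditctl and augenrules report 'No rules'. A sketch of verifying that state:

    auditctl -l                     # lists loaded kernel audit rules; 'No rules' here
    systemctl restart audit-rules   # reloads whatever remains under /etc/audit/rules.d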
Jan 29 16:12:13.196219 dockerd[1787]: time="2025-01-29T16:12:13.196143429Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 29 16:12:13.196394 dockerd[1787]: time="2025-01-29T16:12:13.196290919Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Jan 29 16:12:13.196470 dockerd[1787]: time="2025-01-29T16:12:13.196452963Z" level=info msg="Daemon has completed initialization" Jan 29 16:12:13.233797 dockerd[1787]: time="2025-01-29T16:12:13.233560057Z" level=info msg="API listen on /run/docker.sock" Jan 29 16:12:13.233893 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 29 16:12:14.658091 containerd[1493]: time="2025-01-29T16:12:14.657975836Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.5\"" Jan 29 16:12:15.593704 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount475073554.mount: Deactivated successfully. Jan 29 16:12:17.628011 containerd[1493]: time="2025-01-29T16:12:17.627799817Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:17.630476 containerd[1493]: time="2025-01-29T16:12:17.630250222Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.5: active requests=0, bytes read=27976729" Jan 29 16:12:17.631444 containerd[1493]: time="2025-01-29T16:12:17.631354980Z" level=info msg="ImageCreate event name:\"sha256:2212e74642e45d72a36f297bea139f607ce4ccc4792966a8e9c4d30e04a4a6fb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:17.636940 containerd[1493]: time="2025-01-29T16:12:17.636903025Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:fc4b366c0036b90d147f3b58244cf7d5f1f42b0db539f0fe83a8fc6e25a434ab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:17.640586 containerd[1493]: time="2025-01-29T16:12:17.639657107Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.5\" with image id \"sha256:2212e74642e45d72a36f297bea139f607ce4ccc4792966a8e9c4d30e04a4a6fb\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:fc4b366c0036b90d147f3b58244cf7d5f1f42b0db539f0fe83a8fc6e25a434ab\", size \"27973521\" in 2.981573326s" Jan 29 16:12:17.640586 containerd[1493]: time="2025-01-29T16:12:17.639724632Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.5\" returns image reference \"sha256:2212e74642e45d72a36f297bea139f607ce4ccc4792966a8e9c4d30e04a4a6fb\"" Jan 29 16:12:17.644298 containerd[1493]: time="2025-01-29T16:12:17.644253569Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.5\"" Jan 29 16:12:17.650505 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Jan 29 16:12:18.708605 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 29 16:12:18.717612 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 16:12:18.919438 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
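Docker has come up on the overlay2 storage driver; the 'Not using native diff' warning above is only a performance note triggered by the kernel's CONFIG_OVERLAY_FS_REDIRECT_DIR setting, not an error. Illustrative checks against the running daemon:

    docker info --format '{{.Driver}}'              # overlay2
    docker version --format '{{.Server.Version}}'   # 26.1.0, matching the log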
Jan 29 16:12:18.920156 (kubelet)[1999]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 16:12:18.990346 kubelet[1999]: E0129 16:12:18.989813 1999 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 16:12:18.992774 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 16:12:18.993025 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 29 16:12:19.821521 containerd[1493]: time="2025-01-29T16:12:19.821410675Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:19.823470 containerd[1493]: time="2025-01-29T16:12:19.823401921Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.5: active requests=0, bytes read=24701151" Jan 29 16:12:19.824193 containerd[1493]: time="2025-01-29T16:12:19.823798800Z" level=info msg="ImageCreate event name:\"sha256:d7fccb640e0edce9c47bd71f2b2ce328b824bea199bfe5838dda3fe2af6372f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:19.832037 containerd[1493]: time="2025-01-29T16:12:19.831828445Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:848cf42bf6c3c5ccac232b76c901c309edb3ebeac4d856885af0fc718798207e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:19.835510 containerd[1493]: time="2025-01-29T16:12:19.835459050Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.5\" with image id \"sha256:d7fccb640e0edce9c47bd71f2b2ce328b824bea199bfe5838dda3fe2af6372f2\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:848cf42bf6c3c5ccac232b76c901c309edb3ebeac4d856885af0fc718798207e\", size \"26147725\" in 2.19114683s" Jan 29 16:12:19.835604 containerd[1493]: time="2025-01-29T16:12:19.835564607Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.5\" returns image reference \"sha256:d7fccb640e0edce9c47bd71f2b2ce328b824bea199bfe5838dda3fe2af6372f2\"" Jan 29 16:12:19.837391 containerd[1493]: time="2025-01-29T16:12:19.837106302Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.5\"" Jan 29 16:12:21.473845 containerd[1493]: time="2025-01-29T16:12:21.473744116Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:21.475377 containerd[1493]: time="2025-01-29T16:12:21.475310722Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.5: active requests=0, bytes read=18652061" Jan 29 16:12:21.476205 containerd[1493]: time="2025-01-29T16:12:21.476151758Z" level=info msg="ImageCreate event name:\"sha256:4b2fb209f5d1efc0fc980c5acda28886e4eb6ab4820173976bdd441cbd2ee09a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:21.480992 containerd[1493]: time="2025-01-29T16:12:21.480930917Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:0e01fd956ba32a7fa08f6b6da24e8c49015905c8e2cf752978d495e44cd4a8a9\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Jan 29 16:12:21.482613 containerd[1493]: time="2025-01-29T16:12:21.482441233Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.5\" with image id \"sha256:4b2fb209f5d1efc0fc980c5acda28886e4eb6ab4820173976bdd441cbd2ee09a\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:0e01fd956ba32a7fa08f6b6da24e8c49015905c8e2cf752978d495e44cd4a8a9\", size \"20098653\" in 1.645286141s" Jan 29 16:12:21.482613 containerd[1493]: time="2025-01-29T16:12:21.482480501Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.5\" returns image reference \"sha256:4b2fb209f5d1efc0fc980c5acda28886e4eb6ab4820173976bdd441cbd2ee09a\"" Jan 29 16:12:21.483437 containerd[1493]: time="2025-01-29T16:12:21.483343358Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.5\"" Jan 29 16:12:23.004009 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1876466930.mount: Deactivated successfully. Jan 29 16:12:23.730986 containerd[1493]: time="2025-01-29T16:12:23.730840486Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:23.733108 containerd[1493]: time="2025-01-29T16:12:23.732455226Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.5: active requests=0, bytes read=30231136" Jan 29 16:12:23.733668 containerd[1493]: time="2025-01-29T16:12:23.733622393Z" level=info msg="ImageCreate event name:\"sha256:34018aef09a62f8b40bdd1d2e1bf6c48f359cab492d51059a09e20745ab02ce2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:23.737834 containerd[1493]: time="2025-01-29T16:12:23.737769523Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:c00685cc45c1fb539c5bbd8d24d2577f96e9399efac1670f688f654b30f8c64c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:23.739652 containerd[1493]: time="2025-01-29T16:12:23.738890258Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.5\" with image id \"sha256:34018aef09a62f8b40bdd1d2e1bf6c48f359cab492d51059a09e20745ab02ce2\", repo tag \"registry.k8s.io/kube-proxy:v1.31.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:c00685cc45c1fb539c5bbd8d24d2577f96e9399efac1670f688f654b30f8c64c\", size \"30230147\" in 2.255325912s" Jan 29 16:12:23.739652 containerd[1493]: time="2025-01-29T16:12:23.738933840Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.5\" returns image reference \"sha256:34018aef09a62f8b40bdd1d2e1bf6c48f359cab492d51059a09e20745ab02ce2\"" Jan 29 16:12:23.740320 containerd[1493]: time="2025-01-29T16:12:23.740154347Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Jan 29 16:12:24.409446 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2815516241.mount: Deactivated successfully. 
Jan 29 16:12:25.588919 containerd[1493]: time="2025-01-29T16:12:25.588825889Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:25.590797 containerd[1493]: time="2025-01-29T16:12:25.590743937Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769" Jan 29 16:12:25.592504 containerd[1493]: time="2025-01-29T16:12:25.592447646Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:25.599312 containerd[1493]: time="2025-01-29T16:12:25.599268509Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:25.599578 containerd[1493]: time="2025-01-29T16:12:25.599533451Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.858987642s" Jan 29 16:12:25.599775 containerd[1493]: time="2025-01-29T16:12:25.599719603Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Jan 29 16:12:25.601574 containerd[1493]: time="2025-01-29T16:12:25.601534691Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 29 16:12:26.236307 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3400818765.mount: Deactivated successfully. 
Jan 29 16:12:26.243219 containerd[1493]: time="2025-01-29T16:12:26.242633953Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:26.243826 containerd[1493]: time="2025-01-29T16:12:26.243763104Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" Jan 29 16:12:26.244893 containerd[1493]: time="2025-01-29T16:12:26.244794803Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:26.250200 containerd[1493]: time="2025-01-29T16:12:26.249608106Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:26.250796 containerd[1493]: time="2025-01-29T16:12:26.250764408Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 649.183944ms" Jan 29 16:12:26.250962 containerd[1493]: time="2025-01-29T16:12:26.250931537Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jan 29 16:12:26.251715 containerd[1493]: time="2025-01-29T16:12:26.251667309Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Jan 29 16:12:26.867685 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount433050629.mount: Deactivated successfully. Jan 29 16:12:29.208499 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 29 16:12:29.220450 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 16:12:29.729453 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 16:12:29.731786 (kubelet)[2127]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 29 16:12:29.884725 kubelet[2127]: E0129 16:12:29.884176 2127 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 29 16:12:29.889635 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 29 16:12:29.889908 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
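Both kubelet starts so far (restart counters 3 and 4) fail identically: /var/lib/kubelet/config.yaml does not exist yet because nothing has run kubeadm on this node at this point, and systemd keeps rescheduling the unit in the meantime. The pre-flight condition the error implies, written out as a hypothetical helper (not part of the kubelet itself):

    import os
    import sys

    CONFIG = "/var/lib/kubelet/config.yaml"  # the path named in run.go:72 above

    # kubeadm init/join writes this file; until it exists the kubelet exits
    # with status 1 and systemd restarts it on a timer, as logged above.
    if not os.path.exists(CONFIG):
        sys.exit(f"{CONFIG} missing; run kubeadm init/join before starting the kubelet")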
Jan 29 16:12:30.069535 containerd[1493]: time="2025-01-29T16:12:30.069251808Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:30.071669 containerd[1493]: time="2025-01-29T16:12:30.071547022Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56779981" Jan 29 16:12:30.072842 containerd[1493]: time="2025-01-29T16:12:30.072717590Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:30.077688 containerd[1493]: time="2025-01-29T16:12:30.077578943Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:30.080198 containerd[1493]: time="2025-01-29T16:12:30.079650730Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 3.827800129s" Jan 29 16:12:30.080198 containerd[1493]: time="2025-01-29T16:12:30.079727123Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Jan 29 16:12:30.342870 update_engine[1483]: I20250129 16:12:30.342465 1483 update_attempter.cc:509] Updating boot flags... Jan 29 16:12:30.409260 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (2150) Jan 29 16:12:30.506472 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 38 scanned by (udev-worker) (2149) Jan 29 16:12:34.875257 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 16:12:34.886018 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 16:12:34.926587 systemd[1]: Reloading requested from client PID 2176 ('systemctl') (unit session-11.scope)... Jan 29 16:12:34.926829 systemd[1]: Reloading... Jan 29 16:12:35.163341 zram_generator::config[2215]: No configuration found. Jan 29 16:12:35.280783 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 29 16:12:35.391721 systemd[1]: Reloading finished in 463 ms. Jan 29 16:12:35.470117 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 29 16:12:35.470314 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 29 16:12:35.470767 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 16:12:35.477721 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 16:12:35.623791 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 16:12:35.635643 (kubelet)[2283]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 29 16:12:35.718258 kubelet[2283]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 29 16:12:35.718258 kubelet[2283]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 29 16:12:35.718258 kubelet[2283]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 29 16:12:35.718258 kubelet[2283]: I0129 16:12:35.715098 2283 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 29 16:12:36.184688 kubelet[2283]: I0129 16:12:36.184626 2283 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Jan 29 16:12:36.185183 kubelet[2283]: I0129 16:12:36.184969 2283 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 29 16:12:36.185896 kubelet[2283]: I0129 16:12:36.185780 2283 server.go:929] "Client rotation is on, will bootstrap in background" Jan 29 16:12:36.220197 kubelet[2283]: I0129 16:12:36.219643 2283 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 29 16:12:36.221384 kubelet[2283]: E0129 16:12:36.221005 2283 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.230.37.146:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.230.37.146:6443: connect: connection refused" logger="UnhandledError" Jan 29 16:12:36.233048 kubelet[2283]: E0129 16:12:36.233007 2283 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Jan 29 16:12:36.233298 kubelet[2283]: I0129 16:12:36.233275 2283 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Jan 29 16:12:36.242115 kubelet[2283]: I0129 16:12:36.241976 2283 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 29 16:12:36.243193 kubelet[2283]: I0129 16:12:36.242320 2283 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 29 16:12:36.243193 kubelet[2283]: I0129 16:12:36.242558 2283 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 29 16:12:36.243193 kubelet[2283]: I0129 16:12:36.242602 2283 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-6bdnt.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 29 16:12:36.243193 kubelet[2283]: I0129 16:12:36.242886 2283 topology_manager.go:138] "Creating topology manager with none policy" Jan 29 16:12:36.243835 kubelet[2283]: I0129 16:12:36.242901 2283 container_manager_linux.go:300] "Creating device plugin manager" Jan 29 16:12:36.243835 kubelet[2283]: I0129 16:12:36.243118 2283 state_mem.go:36] "Initialized new in-memory state store" Jan 29 16:12:36.247830 kubelet[2283]: I0129 16:12:36.247798 2283 kubelet.go:408] "Attempting to sync node with API server" Jan 29 16:12:36.248067 kubelet[2283]: I0129 16:12:36.247941 2283 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 29 16:12:36.248067 kubelet[2283]: I0129 16:12:36.248016 2283 kubelet.go:314] "Adding apiserver pod source" Jan 29 16:12:36.248574 kubelet[2283]: I0129 16:12:36.248254 2283 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 29 16:12:36.255119 kubelet[2283]: W0129 16:12:36.254314 2283 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.37.146:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-6bdnt.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.37.146:6443: connect: connection refused Jan 29 16:12:36.255119 kubelet[2283]: E0129 16:12:36.254410 2283 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://10.230.37.146:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-6bdnt.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.37.146:6443: connect: connection refused" logger="UnhandledError" Jan 29 16:12:36.255119 kubelet[2283]: W0129 16:12:36.255010 2283 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.37.146:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.37.146:6443: connect: connection refused Jan 29 16:12:36.255119 kubelet[2283]: E0129 16:12:36.255060 2283 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.37.146:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.37.146:6443: connect: connection refused" logger="UnhandledError" Jan 29 16:12:36.256289 kubelet[2283]: I0129 16:12:36.256047 2283 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Jan 29 16:12:36.260195 kubelet[2283]: I0129 16:12:36.260089 2283 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 29 16:12:36.260376 kubelet[2283]: W0129 16:12:36.260355 2283 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 29 16:12:36.263915 kubelet[2283]: I0129 16:12:36.263874 2283 server.go:1269] "Started kubelet" Jan 29 16:12:36.274182 kubelet[2283]: E0129 16:12:36.271637 2283 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.230.37.146:6443/api/v1/namespaces/default/events\": dial tcp 10.230.37.146:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-6bdnt.gb1.brightbox.com.181f35d1c55b2f48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-6bdnt.gb1.brightbox.com,UID:srv-6bdnt.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-6bdnt.gb1.brightbox.com,},FirstTimestamp:2025-01-29 16:12:36.263825224 +0000 UTC m=+0.620910122,LastTimestamp:2025-01-29 16:12:36.263825224 +0000 UTC m=+0.620910122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-6bdnt.gb1.brightbox.com,}" Jan 29 16:12:36.275800 kubelet[2283]: I0129 16:12:36.275323 2283 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 29 16:12:36.278431 kubelet[2283]: I0129 16:12:36.278338 2283 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 29 16:12:36.279306 kubelet[2283]: I0129 16:12:36.278940 2283 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 29 16:12:36.283029 kubelet[2283]: I0129 16:12:36.282961 2283 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 29 16:12:36.286205 kubelet[2283]: I0129 16:12:36.284691 2283 server.go:460] "Adding debug handlers to kubelet server" Jan 29 16:12:36.286495 kubelet[2283]: I0129 16:12:36.286465 2283 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 29 16:12:36.286855 kubelet[2283]: I0129 16:12:36.283006 2283 dynamic_serving_content.go:135] "Starting controller" 
name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 29 16:12:36.287071 kubelet[2283]: E0129 16:12:36.286827 2283 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-6bdnt.gb1.brightbox.com\" not found" Jan 29 16:12:36.287304 kubelet[2283]: I0129 16:12:36.287277 2283 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 29 16:12:36.287423 kubelet[2283]: I0129 16:12:36.287387 2283 reconciler.go:26] "Reconciler: start to sync state" Jan 29 16:12:36.288495 kubelet[2283]: W0129 16:12:36.288432 2283 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.37.146:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.37.146:6443: connect: connection refused Jan 29 16:12:36.288612 kubelet[2283]: E0129 16:12:36.288519 2283 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.37.146:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.37.146:6443: connect: connection refused" logger="UnhandledError" Jan 29 16:12:36.288694 kubelet[2283]: E0129 16:12:36.288603 2283 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.37.146:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-6bdnt.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.37.146:6443: connect: connection refused" interval="200ms" Jan 29 16:12:36.295089 kubelet[2283]: I0129 16:12:36.295059 2283 factory.go:221] Registration of the containerd container factory successfully Jan 29 16:12:36.295089 kubelet[2283]: I0129 16:12:36.295087 2283 factory.go:221] Registration of the systemd container factory successfully Jan 29 16:12:36.295269 kubelet[2283]: I0129 16:12:36.295210 2283 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 29 16:12:36.318936 kubelet[2283]: I0129 16:12:36.318882 2283 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 29 16:12:36.320893 kubelet[2283]: I0129 16:12:36.320840 2283 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 29 16:12:36.320989 kubelet[2283]: I0129 16:12:36.320910 2283 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 29 16:12:36.320989 kubelet[2283]: I0129 16:12:36.320947 2283 kubelet.go:2321] "Starting kubelet main sync loop" Jan 29 16:12:36.321131 kubelet[2283]: E0129 16:12:36.321031 2283 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 29 16:12:36.332778 kubelet[2283]: W0129 16:12:36.332705 2283 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.37.146:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.37.146:6443: connect: connection refused Jan 29 16:12:36.332873 kubelet[2283]: E0129 16:12:36.332779 2283 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.37.146:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.37.146:6443: connect: connection refused" logger="UnhandledError" Jan 29 16:12:36.336848 kubelet[2283]: E0129 16:12:36.336633 2283 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 29 16:12:36.349952 kubelet[2283]: I0129 16:12:36.349511 2283 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 29 16:12:36.349952 kubelet[2283]: I0129 16:12:36.349538 2283 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 29 16:12:36.349952 kubelet[2283]: I0129 16:12:36.349572 2283 state_mem.go:36] "Initialized new in-memory state store" Jan 29 16:12:36.351718 kubelet[2283]: I0129 16:12:36.351691 2283 policy_none.go:49] "None policy: Start" Jan 29 16:12:36.352861 kubelet[2283]: I0129 16:12:36.352829 2283 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 29 16:12:36.353056 kubelet[2283]: I0129 16:12:36.353016 2283 state_mem.go:35] "Initializing new in-memory state store" Jan 29 16:12:36.364938 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 29 16:12:36.379154 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 29 16:12:36.384820 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 29 16:12:36.387895 kubelet[2283]: E0129 16:12:36.387812 2283 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-6bdnt.gb1.brightbox.com\" not found" Jan 29 16:12:36.395891 kubelet[2283]: I0129 16:12:36.395589 2283 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 29 16:12:36.396091 kubelet[2283]: I0129 16:12:36.396068 2283 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 29 16:12:36.396377 kubelet[2283]: I0129 16:12:36.396325 2283 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 29 16:12:36.397500 kubelet[2283]: I0129 16:12:36.397475 2283 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 29 16:12:36.400675 kubelet[2283]: E0129 16:12:36.400404 2283 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-6bdnt.gb1.brightbox.com\" not found" Jan 29 16:12:36.441551 systemd[1]: Created slice kubepods-burstable-pod36d445239a8c69a871c5cad041caa9eb.slice - libcontainer container kubepods-burstable-pod36d445239a8c69a871c5cad041caa9eb.slice. Jan 29 16:12:36.460465 systemd[1]: Created slice kubepods-burstable-podc5bfd1e625f2046b2aa5b748a0b82e00.slice - libcontainer container kubepods-burstable-podc5bfd1e625f2046b2aa5b748a0b82e00.slice. Jan 29 16:12:36.476732 systemd[1]: Created slice kubepods-burstable-pod37a0a7eff3707086dc40563ec03e9a1b.slice - libcontainer container kubepods-burstable-pod37a0a7eff3707086dc40563ec03e9a1b.slice. Jan 29 16:12:36.489028 kubelet[2283]: I0129 16:12:36.488543 2283 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5bfd1e625f2046b2aa5b748a0b82e00-k8s-certs\") pod \"kube-controller-manager-srv-6bdnt.gb1.brightbox.com\" (UID: \"c5bfd1e625f2046b2aa5b748a0b82e00\") " pod="kube-system/kube-controller-manager-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.489028 kubelet[2283]: I0129 16:12:36.488614 2283 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5bfd1e625f2046b2aa5b748a0b82e00-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-6bdnt.gb1.brightbox.com\" (UID: \"c5bfd1e625f2046b2aa5b748a0b82e00\") " pod="kube-system/kube-controller-manager-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.489028 kubelet[2283]: I0129 16:12:36.488654 2283 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/36d445239a8c69a871c5cad041caa9eb-ca-certs\") pod \"kube-apiserver-srv-6bdnt.gb1.brightbox.com\" (UID: \"36d445239a8c69a871c5cad041caa9eb\") " pod="kube-system/kube-apiserver-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.489028 kubelet[2283]: I0129 16:12:36.488680 2283 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/36d445239a8c69a871c5cad041caa9eb-k8s-certs\") pod \"kube-apiserver-srv-6bdnt.gb1.brightbox.com\" (UID: \"36d445239a8c69a871c5cad041caa9eb\") " pod="kube-system/kube-apiserver-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.489028 kubelet[2283]: I0129 16:12:36.488726 2283 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/36d445239a8c69a871c5cad041caa9eb-usr-share-ca-certificates\") pod \"kube-apiserver-srv-6bdnt.gb1.brightbox.com\" (UID: \"36d445239a8c69a871c5cad041caa9eb\") " pod="kube-system/kube-apiserver-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.489497 kubelet[2283]: I0129 16:12:36.488756 2283 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5bfd1e625f2046b2aa5b748a0b82e00-ca-certs\") pod \"kube-controller-manager-srv-6bdnt.gb1.brightbox.com\" (UID: \"c5bfd1e625f2046b2aa5b748a0b82e00\") " pod="kube-system/kube-controller-manager-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.489497 kubelet[2283]: I0129 16:12:36.488804 2283 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c5bfd1e625f2046b2aa5b748a0b82e00-flexvolume-dir\") pod \"kube-controller-manager-srv-6bdnt.gb1.brightbox.com\" (UID: \"c5bfd1e625f2046b2aa5b748a0b82e00\") " pod="kube-system/kube-controller-manager-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.489497 kubelet[2283]: I0129 16:12:36.488855 2283 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c5bfd1e625f2046b2aa5b748a0b82e00-kubeconfig\") pod \"kube-controller-manager-srv-6bdnt.gb1.brightbox.com\" (UID: \"c5bfd1e625f2046b2aa5b748a0b82e00\") " pod="kube-system/kube-controller-manager-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.489497 kubelet[2283]: I0129 16:12:36.488884 2283 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/37a0a7eff3707086dc40563ec03e9a1b-kubeconfig\") pod \"kube-scheduler-srv-6bdnt.gb1.brightbox.com\" (UID: \"37a0a7eff3707086dc40563ec03e9a1b\") " pod="kube-system/kube-scheduler-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.489497 kubelet[2283]: E0129 16:12:36.489126 2283 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.37.146:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-6bdnt.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.37.146:6443: connect: connection refused" interval="400ms" Jan 29 16:12:36.501003 kubelet[2283]: I0129 16:12:36.500961 2283 kubelet_node_status.go:72] "Attempting to register node" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.501498 kubelet[2283]: E0129 16:12:36.501466 2283 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.37.146:6443/api/v1/nodes\": dial tcp 10.230.37.146:6443: connect: connection refused" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.705505 kubelet[2283]: I0129 16:12:36.705357 2283 kubelet_node_status.go:72] "Attempting to register node" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.706035 kubelet[2283]: E0129 16:12:36.705835 2283 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.37.146:6443/api/v1/nodes\": dial tcp 10.230.37.146:6443: connect: connection refused" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:36.760237 containerd[1493]: time="2025-01-29T16:12:36.760072515Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-6bdnt.gb1.brightbox.com,Uid:36d445239a8c69a871c5cad041caa9eb,Namespace:kube-system,Attempt:0,}" Jan 29 16:12:36.774393 containerd[1493]: time="2025-01-29T16:12:36.774237022Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-6bdnt.gb1.brightbox.com,Uid:c5bfd1e625f2046b2aa5b748a0b82e00,Namespace:kube-system,Attempt:0,}" Jan 29 16:12:36.787451 containerd[1493]: time="2025-01-29T16:12:36.787406381Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-6bdnt.gb1.brightbox.com,Uid:37a0a7eff3707086dc40563ec03e9a1b,Namespace:kube-system,Attempt:0,}" Jan 29 16:12:36.890434 kubelet[2283]: E0129 16:12:36.890367 2283 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.37.146:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-6bdnt.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.37.146:6443: connect: connection refused" interval="800ms" Jan 29 16:12:37.109249 kubelet[2283]: I0129 16:12:37.109210 2283 kubelet_node_status.go:72] "Attempting to register node" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:37.109811 kubelet[2283]: E0129 16:12:37.109773 2283 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.37.146:6443/api/v1/nodes\": dial tcp 10.230.37.146:6443: connect: connection refused" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:37.184318 kubelet[2283]: W0129 16:12:37.184252 2283 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.37.146:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.37.146:6443: connect: connection refused Jan 29 16:12:37.184420 kubelet[2283]: E0129 16:12:37.184349 2283 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.37.146:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.37.146:6443: connect: connection refused" logger="UnhandledError" Jan 29 16:12:37.330575 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2655301692.mount: Deactivated successfully. 
Jan 29 16:12:37.338795 containerd[1493]: time="2025-01-29T16:12:37.338749970Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 29 16:12:37.340051 containerd[1493]: time="2025-01-29T16:12:37.340012598Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 29 16:12:37.341200 containerd[1493]: time="2025-01-29T16:12:37.341110025Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064" Jan 29 16:12:37.341436 containerd[1493]: time="2025-01-29T16:12:37.341395866Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 29 16:12:37.342099 containerd[1493]: time="2025-01-29T16:12:37.342063491Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 29 16:12:37.343157 containerd[1493]: time="2025-01-29T16:12:37.343099203Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 29 16:12:37.344198 containerd[1493]: time="2025-01-29T16:12:37.343899643Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 29 16:12:37.348486 containerd[1493]: time="2025-01-29T16:12:37.348451961Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 29 16:12:37.350998 containerd[1493]: time="2025-01-29T16:12:37.350939533Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 563.210603ms" Jan 29 16:12:37.355529 containerd[1493]: time="2025-01-29T16:12:37.355413660Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 581.042411ms" Jan 29 16:12:37.361194 containerd[1493]: time="2025-01-29T16:12:37.359987465Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 599.125383ms" Jan 29 16:12:37.442977 kubelet[2283]: W0129 16:12:37.442841 2283 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.37.146:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.37.146:6443: connect: connection refused Jan 29 16:12:37.442977 
kubelet[2283]: E0129 16:12:37.442943 2283 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.37.146:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.37.146:6443: connect: connection refused" logger="UnhandledError" Jan 29 16:12:37.561521 kubelet[2283]: W0129 16:12:37.552816 2283 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.37.146:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-6bdnt.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.37.146:6443: connect: connection refused Jan 29 16:12:37.561521 kubelet[2283]: E0129 16:12:37.552923 2283 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.230.37.146:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-6bdnt.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.37.146:6443: connect: connection refused" logger="UnhandledError" Jan 29 16:12:37.614289 containerd[1493]: time="2025-01-29T16:12:37.612301691Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:12:37.614289 containerd[1493]: time="2025-01-29T16:12:37.612422841Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:12:37.614289 containerd[1493]: time="2025-01-29T16:12:37.612441244Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:12:37.614289 containerd[1493]: time="2025-01-29T16:12:37.612567898Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:12:37.614644 containerd[1493]: time="2025-01-29T16:12:37.613016845Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:12:37.614644 containerd[1493]: time="2025-01-29T16:12:37.613076935Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:12:37.614644 containerd[1493]: time="2025-01-29T16:12:37.613100349Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:12:37.614644 containerd[1493]: time="2025-01-29T16:12:37.613221198Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:12:37.621739 containerd[1493]: time="2025-01-29T16:12:37.621639433Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:12:37.621858 containerd[1493]: time="2025-01-29T16:12:37.621699184Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:12:37.621858 containerd[1493]: time="2025-01-29T16:12:37.621733555Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:12:37.622042 containerd[1493]: time="2025-01-29T16:12:37.621864288Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:12:37.661254 systemd[1]: Started cri-containerd-a6321af46c5ca30b641e48cd2383915d97cebca4cf54e86ae7f99e37f8fdde96.scope - libcontainer container a6321af46c5ca30b641e48cd2383915d97cebca4cf54e86ae7f99e37f8fdde96. Jan 29 16:12:37.670428 systemd[1]: Started cri-containerd-9c37ab171cdadb49445624d0c6dfcac1821322a1ae8e6719303cd74dc81d4b13.scope - libcontainer container 9c37ab171cdadb49445624d0c6dfcac1821322a1ae8e6719303cd74dc81d4b13. Jan 29 16:12:37.675362 systemd[1]: Started cri-containerd-d9f8cfd3bf555f723cbfed8bba282c81f9dc7667ad09f70fbca7a5fae2131761.scope - libcontainer container d9f8cfd3bf555f723cbfed8bba282c81f9dc7667ad09f70fbca7a5fae2131761. Jan 29 16:12:37.693578 kubelet[2283]: E0129 16:12:37.693481 2283 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.37.146:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-6bdnt.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.37.146:6443: connect: connection refused" interval="1.6s" Jan 29 16:12:37.713256 kubelet[2283]: W0129 16:12:37.713150 2283 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.37.146:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.37.146:6443: connect: connection refused Jan 29 16:12:37.713365 kubelet[2283]: E0129 16:12:37.713271 2283 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.37.146:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.37.146:6443: connect: connection refused" logger="UnhandledError" Jan 29 16:12:37.775405 containerd[1493]: time="2025-01-29T16:12:37.775129488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-6bdnt.gb1.brightbox.com,Uid:36d445239a8c69a871c5cad041caa9eb,Namespace:kube-system,Attempt:0,} returns sandbox id \"a6321af46c5ca30b641e48cd2383915d97cebca4cf54e86ae7f99e37f8fdde96\"" Jan 29 16:12:37.785867 containerd[1493]: time="2025-01-29T16:12:37.784270514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-6bdnt.gb1.brightbox.com,Uid:c5bfd1e625f2046b2aa5b748a0b82e00,Namespace:kube-system,Attempt:0,} returns sandbox id \"9c37ab171cdadb49445624d0c6dfcac1821322a1ae8e6719303cd74dc81d4b13\"" Jan 29 16:12:37.790406 containerd[1493]: time="2025-01-29T16:12:37.789863188Z" level=info msg="CreateContainer within sandbox \"a6321af46c5ca30b641e48cd2383915d97cebca4cf54e86ae7f99e37f8fdde96\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 29 16:12:37.792101 containerd[1493]: time="2025-01-29T16:12:37.792063910Z" level=info msg="CreateContainer within sandbox \"9c37ab171cdadb49445624d0c6dfcac1821322a1ae8e6719303cd74dc81d4b13\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 29 16:12:37.805902 containerd[1493]: time="2025-01-29T16:12:37.805845696Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-6bdnt.gb1.brightbox.com,Uid:37a0a7eff3707086dc40563ec03e9a1b,Namespace:kube-system,Attempt:0,} returns sandbox id \"d9f8cfd3bf555f723cbfed8bba282c81f9dc7667ad09f70fbca7a5fae2131761\"" Jan 29 
16:12:37.811226 containerd[1493]: time="2025-01-29T16:12:37.811133798Z" level=info msg="CreateContainer within sandbox \"d9f8cfd3bf555f723cbfed8bba282c81f9dc7667ad09f70fbca7a5fae2131761\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 29 16:12:37.835145 containerd[1493]: time="2025-01-29T16:12:37.835087418Z" level=info msg="CreateContainer within sandbox \"a6321af46c5ca30b641e48cd2383915d97cebca4cf54e86ae7f99e37f8fdde96\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"b3708e6450294458e1b2dd7efcc66bf0a15f5f084e1eec398ce5e75133fbeeb9\"" Jan 29 16:12:37.836076 containerd[1493]: time="2025-01-29T16:12:37.835920267Z" level=info msg="StartContainer for \"b3708e6450294458e1b2dd7efcc66bf0a15f5f084e1eec398ce5e75133fbeeb9\"" Jan 29 16:12:37.841379 containerd[1493]: time="2025-01-29T16:12:37.841336885Z" level=info msg="CreateContainer within sandbox \"9c37ab171cdadb49445624d0c6dfcac1821322a1ae8e6719303cd74dc81d4b13\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b3a742e210eb103afd2d8c696762f991f51c4344c2584f5611e6cac771dea41f\"" Jan 29 16:12:37.841940 containerd[1493]: time="2025-01-29T16:12:37.841905107Z" level=info msg="StartContainer for \"b3a742e210eb103afd2d8c696762f991f51c4344c2584f5611e6cac771dea41f\"" Jan 29 16:12:37.860659 containerd[1493]: time="2025-01-29T16:12:37.860453138Z" level=info msg="CreateContainer within sandbox \"d9f8cfd3bf555f723cbfed8bba282c81f9dc7667ad09f70fbca7a5fae2131761\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"eb1a26f6c152e2c69abd341228e61dec9051fc8208fceebc95919e046f2bc517\"" Jan 29 16:12:37.861215 containerd[1493]: time="2025-01-29T16:12:37.861159271Z" level=info msg="StartContainer for \"eb1a26f6c152e2c69abd341228e61dec9051fc8208fceebc95919e046f2bc517\"" Jan 29 16:12:37.881452 systemd[1]: Started cri-containerd-b3708e6450294458e1b2dd7efcc66bf0a15f5f084e1eec398ce5e75133fbeeb9.scope - libcontainer container b3708e6450294458e1b2dd7efcc66bf0a15f5f084e1eec398ce5e75133fbeeb9. Jan 29 16:12:37.902363 systemd[1]: Started cri-containerd-b3a742e210eb103afd2d8c696762f991f51c4344c2584f5611e6cac771dea41f.scope - libcontainer container b3a742e210eb103afd2d8c696762f991f51c4344c2584f5611e6cac771dea41f. Jan 29 16:12:37.920453 kubelet[2283]: I0129 16:12:37.920065 2283 kubelet_node_status.go:72] "Attempting to register node" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:37.920885 kubelet[2283]: E0129 16:12:37.920546 2283 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.37.146:6443/api/v1/nodes\": dial tcp 10.230.37.146:6443: connect: connection refused" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:37.929376 systemd[1]: Started cri-containerd-eb1a26f6c152e2c69abd341228e61dec9051fc8208fceebc95919e046f2bc517.scope - libcontainer container eb1a26f6c152e2c69abd341228e61dec9051fc8208fceebc95919e046f2bc517. 
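Meanwhile every lease attempt against 10.230.37.146:6443 keeps failing with connection refused, and the "will retry" interval the controller logs doubles each time: 200ms, 400ms, 800ms, and now 1.6s. The progression is plain exponential backoff:

    interval = 0.2  # seconds; the first "will retry" interval logged above
    for _ in range(4):
        print(f"{interval:g}s")  # 0.2s 0.4s 0.8s 1.6s, matching the logged intervals
        interval *= 2

In practice the loop ends once the kube-apiserver container just started above begins serving, which is also when node registration finally succeeds a few seconds below.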
Jan 29 16:12:38.017412 containerd[1493]: time="2025-01-29T16:12:38.017354835Z" level=info msg="StartContainer for \"b3708e6450294458e1b2dd7efcc66bf0a15f5f084e1eec398ce5e75133fbeeb9\" returns successfully" Jan 29 16:12:38.017659 containerd[1493]: time="2025-01-29T16:12:38.017361559Z" level=info msg="StartContainer for \"b3a742e210eb103afd2d8c696762f991f51c4344c2584f5611e6cac771dea41f\" returns successfully" Jan 29 16:12:38.057247 containerd[1493]: time="2025-01-29T16:12:38.056413580Z" level=info msg="StartContainer for \"eb1a26f6c152e2c69abd341228e61dec9051fc8208fceebc95919e046f2bc517\" returns successfully" Jan 29 16:12:38.339365 kubelet[2283]: E0129 16:12:38.337100 2283 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.230.37.146:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.230.37.146:6443: connect: connection refused" logger="UnhandledError" Jan 29 16:12:39.524506 kubelet[2283]: I0129 16:12:39.524443 2283 kubelet_node_status.go:72] "Attempting to register node" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:40.893474 kubelet[2283]: E0129 16:12:40.893364 2283 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-6bdnt.gb1.brightbox.com\" not found" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:40.916586 kubelet[2283]: I0129 16:12:40.916532 2283 kubelet_node_status.go:75] "Successfully registered node" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:40.916767 kubelet[2283]: E0129 16:12:40.916594 2283 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"srv-6bdnt.gb1.brightbox.com\": node \"srv-6bdnt.gb1.brightbox.com\" not found" Jan 29 16:12:40.979011 kubelet[2283]: E0129 16:12:40.978807 2283 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{srv-6bdnt.gb1.brightbox.com.181f35d1c55b2f48 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-6bdnt.gb1.brightbox.com,UID:srv-6bdnt.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-6bdnt.gb1.brightbox.com,},FirstTimestamp:2025-01-29 16:12:36.263825224 +0000 UTC m=+0.620910122,LastTimestamp:2025-01-29 16:12:36.263825224 +0000 UTC m=+0.620910122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-6bdnt.gb1.brightbox.com,}" Jan 29 16:12:41.037144 kubelet[2283]: E0129 16:12:41.036996 2283 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{srv-6bdnt.gb1.brightbox.com.181f35d1c9b1e144 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-6bdnt.gb1.brightbox.com,UID:srv-6bdnt.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:InvalidDiskCapacity,Message:invalid capacity 0 on image filesystem,Source:EventSource{Component:kubelet,Host:srv-6bdnt.gb1.brightbox.com,},FirstTimestamp:2025-01-29 16:12:36.336615748 +0000 UTC m=+0.693700636,LastTimestamp:2025-01-29 16:12:36.336615748 +0000 UTC m=+0.693700636,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-6bdnt.gb1.brightbox.com,}" Jan 29 16:12:41.256626 kubelet[2283]: I0129 16:12:41.256525 2283 apiserver.go:52] "Watching apiserver" Jan 29 16:12:41.288061 kubelet[2283]: I0129 16:12:41.287964 2283 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 29 16:12:43.161597 systemd[1]: Reloading requested from client PID 2560 ('systemctl') (unit session-11.scope)... Jan 29 16:12:43.162269 systemd[1]: Reloading... Jan 29 16:12:43.287238 zram_generator::config[2596]: No configuration found. Jan 29 16:12:43.479301 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 29 16:12:43.616935 systemd[1]: Reloading finished in 454 ms. Jan 29 16:12:43.683020 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 16:12:43.699159 systemd[1]: kubelet.service: Deactivated successfully. Jan 29 16:12:43.699647 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 16:12:43.699784 systemd[1]: kubelet.service: Consumed 1.176s CPU time, 115.3M memory peak, 0B memory swap peak. Jan 29 16:12:43.714043 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 29 16:12:43.942468 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 29 16:12:43.959036 (kubelet)[2663]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 29 16:12:44.054912 kubelet[2663]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 29 16:12:44.054912 kubelet[2663]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 29 16:12:44.054912 kubelet[2663]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 29 16:12:44.055682 kubelet[2663]: I0129 16:12:44.054994 2663 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 29 16:12:44.064272 kubelet[2663]: I0129 16:12:44.064243 2663 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Jan 29 16:12:44.064272 kubelet[2663]: I0129 16:12:44.064271 2663 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 29 16:12:44.064645 kubelet[2663]: I0129 16:12:44.064598 2663 server.go:929] "Client rotation is on, will bootstrap in background" Jan 29 16:12:44.066757 kubelet[2663]: I0129 16:12:44.066549 2663 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
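Unlike the crash-looping starts earlier, this kubelet (the instance tagged 2663) finds /var/lib/kubelet/pki/kubelet-client-current.pem and bootstraps client rotation from it, meaning the TLS bootstrap completed while the control plane was coming up. One way to peek at the rotated client certificate (a sketch assuming the third-party "cryptography" package and that the file's first PEM block is the certificate):

    from cryptography import x509

    PEM = "/var/lib/kubelet/pki/kubelet-client-current.pem"  # path logged above

    with open(PEM, "rb") as f:
        cert = x509.load_pem_x509_certificate(f.read())

    print(cert.subject)          # typically O=system:nodes, CN=system:node:<hostname>
    print(cert.not_valid_after)  # rotation renews the pair before this expiry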
Jan 29 16:12:44.081757 kubelet[2663]: I0129 16:12:44.081375 2663 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 29 16:12:44.087887 kubelet[2663]: E0129 16:12:44.087769 2663 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Jan 29 16:12:44.087887 kubelet[2663]: I0129 16:12:44.087883 2663 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Jan 29 16:12:44.095056 kubelet[2663]: I0129 16:12:44.094879 2663 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jan 29 16:12:44.096048 kubelet[2663]: I0129 16:12:44.096000 2663 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 29 16:12:44.096622 kubelet[2663]: I0129 16:12:44.096556 2663 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 29 16:12:44.096914 kubelet[2663]: I0129 16:12:44.096620 2663 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-6bdnt.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 29 16:12:44.097094 kubelet[2663]: I0129 16:12:44.096937 2663 topology_manager.go:138] "Creating topology manager with none policy" Jan 29 16:12:44.097094 kubelet[2663]: I0129 16:12:44.096955 2663 container_manager_linux.go:300] "Creating device plugin manager" Jan 29 16:12:44.097094 kubelet[2663]: I0129 16:12:44.097039 2663 state_mem.go:36] "Initialized new in-memory state store" Jan 29 16:12:44.097481 kubelet[2663]: I0129 16:12:44.097240 2663 kubelet.go:408] "Attempting to sync node with API server" Jan 29 16:12:44.097481 kubelet[2663]: I0129 16:12:44.097288 2663 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 29 16:12:44.097481 kubelet[2663]: I0129 
16:12:44.097354 2663 kubelet.go:314] "Adding apiserver pod source" Jan 29 16:12:44.097481 kubelet[2663]: I0129 16:12:44.097380 2663 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 29 16:12:44.107749 kubelet[2663]: I0129 16:12:44.104364 2663 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Jan 29 16:12:44.107749 kubelet[2663]: I0129 16:12:44.104943 2663 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 29 16:12:44.107749 kubelet[2663]: I0129 16:12:44.105765 2663 server.go:1269] "Started kubelet" Jan 29 16:12:44.113019 kubelet[2663]: I0129 16:12:44.112974 2663 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 29 16:12:44.124345 kubelet[2663]: I0129 16:12:44.124301 2663 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 29 16:12:44.125912 kubelet[2663]: I0129 16:12:44.125887 2663 server.go:460] "Adding debug handlers to kubelet server" Jan 29 16:12:44.132245 kubelet[2663]: I0129 16:12:44.129481 2663 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 29 16:12:44.133783 kubelet[2663]: I0129 16:12:44.133758 2663 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 29 16:12:44.134206 kubelet[2663]: I0129 16:12:44.134182 2663 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 29 16:12:44.141281 kubelet[2663]: I0129 16:12:44.140809 2663 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 29 16:12:44.142296 kubelet[2663]: I0129 16:12:44.142272 2663 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 29 16:12:44.143532 kubelet[2663]: I0129 16:12:44.143511 2663 reconciler.go:26] "Reconciler: start to sync state" Jan 29 16:12:44.153697 kubelet[2663]: I0129 16:12:44.153664 2663 factory.go:221] Registration of the systemd container factory successfully Jan 29 16:12:44.153982 kubelet[2663]: I0129 16:12:44.153952 2663 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 29 16:12:44.157585 kubelet[2663]: E0129 16:12:44.156935 2663 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 29 16:12:44.158039 kubelet[2663]: I0129 16:12:44.158015 2663 factory.go:221] Registration of the containerd container factory successfully Jan 29 16:12:44.167374 kubelet[2663]: I0129 16:12:44.166830 2663 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 29 16:12:44.168455 kubelet[2663]: I0129 16:12:44.168417 2663 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 29 16:12:44.168518 kubelet[2663]: I0129 16:12:44.168474 2663 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 29 16:12:44.168518 kubelet[2663]: I0129 16:12:44.168515 2663 kubelet.go:2321] "Starting kubelet main sync loop" Jan 29 16:12:44.168637 kubelet[2663]: E0129 16:12:44.168599 2663 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 29 16:12:44.253662 kubelet[2663]: I0129 16:12:44.253609 2663 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 29 16:12:44.253662 kubelet[2663]: I0129 16:12:44.253640 2663 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 29 16:12:44.253662 kubelet[2663]: I0129 16:12:44.253673 2663 state_mem.go:36] "Initialized new in-memory state store" Jan 29 16:12:44.254031 kubelet[2663]: I0129 16:12:44.253926 2663 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 29 16:12:44.254031 kubelet[2663]: I0129 16:12:44.253951 2663 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 29 16:12:44.254031 kubelet[2663]: I0129 16:12:44.253986 2663 policy_none.go:49] "None policy: Start" Jan 29 16:12:44.255149 kubelet[2663]: I0129 16:12:44.254909 2663 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 29 16:12:44.255149 kubelet[2663]: I0129 16:12:44.255016 2663 state_mem.go:35] "Initializing new in-memory state store" Jan 29 16:12:44.255927 kubelet[2663]: I0129 16:12:44.255425 2663 state_mem.go:75] "Updated machine memory state" Jan 29 16:12:44.263756 kubelet[2663]: I0129 16:12:44.263434 2663 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 29 16:12:44.263756 kubelet[2663]: I0129 16:12:44.263704 2663 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 29 16:12:44.263880 kubelet[2663]: I0129 16:12:44.263749 2663 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 29 16:12:44.268266 kubelet[2663]: I0129 16:12:44.267654 2663 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 29 16:12:44.306652 kubelet[2663]: W0129 16:12:44.306382 2663 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 29 16:12:44.308328 kubelet[2663]: W0129 16:12:44.307732 2663 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 29 16:12:44.310725 kubelet[2663]: W0129 16:12:44.310623 2663 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 29 16:12:44.345598 kubelet[2663]: I0129 16:12:44.345394 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c5bfd1e625f2046b2aa5b748a0b82e00-flexvolume-dir\") pod \"kube-controller-manager-srv-6bdnt.gb1.brightbox.com\" (UID: \"c5bfd1e625f2046b2aa5b748a0b82e00\") " pod="kube-system/kube-controller-manager-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:44.345598 kubelet[2663]: I0129 16:12:44.345484 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/c5bfd1e625f2046b2aa5b748a0b82e00-kubeconfig\") pod \"kube-controller-manager-srv-6bdnt.gb1.brightbox.com\" (UID: \"c5bfd1e625f2046b2aa5b748a0b82e00\") " pod="kube-system/kube-controller-manager-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:44.346373 kubelet[2663]: I0129 16:12:44.345975 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5bfd1e625f2046b2aa5b748a0b82e00-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-6bdnt.gb1.brightbox.com\" (UID: \"c5bfd1e625f2046b2aa5b748a0b82e00\") " pod="kube-system/kube-controller-manager-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:44.346373 kubelet[2663]: I0129 16:12:44.346021 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/37a0a7eff3707086dc40563ec03e9a1b-kubeconfig\") pod \"kube-scheduler-srv-6bdnt.gb1.brightbox.com\" (UID: \"37a0a7eff3707086dc40563ec03e9a1b\") " pod="kube-system/kube-scheduler-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:44.346373 kubelet[2663]: I0129 16:12:44.346075 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/36d445239a8c69a871c5cad041caa9eb-k8s-certs\") pod \"kube-apiserver-srv-6bdnt.gb1.brightbox.com\" (UID: \"36d445239a8c69a871c5cad041caa9eb\") " pod="kube-system/kube-apiserver-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:44.346373 kubelet[2663]: I0129 16:12:44.346228 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5bfd1e625f2046b2aa5b748a0b82e00-ca-certs\") pod \"kube-controller-manager-srv-6bdnt.gb1.brightbox.com\" (UID: \"c5bfd1e625f2046b2aa5b748a0b82e00\") " pod="kube-system/kube-controller-manager-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:44.346373 kubelet[2663]: I0129 16:12:44.346302 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5bfd1e625f2046b2aa5b748a0b82e00-k8s-certs\") pod \"kube-controller-manager-srv-6bdnt.gb1.brightbox.com\" (UID: \"c5bfd1e625f2046b2aa5b748a0b82e00\") " pod="kube-system/kube-controller-manager-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:44.346721 kubelet[2663]: I0129 16:12:44.346341 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/36d445239a8c69a871c5cad041caa9eb-ca-certs\") pod \"kube-apiserver-srv-6bdnt.gb1.brightbox.com\" (UID: \"36d445239a8c69a871c5cad041caa9eb\") " pod="kube-system/kube-apiserver-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:44.347440 kubelet[2663]: I0129 16:12:44.346936 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/36d445239a8c69a871c5cad041caa9eb-usr-share-ca-certificates\") pod \"kube-apiserver-srv-6bdnt.gb1.brightbox.com\" (UID: \"36d445239a8c69a871c5cad041caa9eb\") " pod="kube-system/kube-apiserver-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:44.397841 kubelet[2663]: I0129 16:12:44.397609 2663 kubelet_node_status.go:72] "Attempting to register node" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:44.411893 kubelet[2663]: I0129 16:12:44.411857 2663 kubelet_node_status.go:111] "Node was previously 
registered" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:44.412010 kubelet[2663]: I0129 16:12:44.411978 2663 kubelet_node_status.go:75] "Successfully registered node" node="srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:45.110688 kubelet[2663]: I0129 16:12:45.109153 2663 apiserver.go:52] "Watching apiserver" Jan 29 16:12:45.143600 kubelet[2663]: I0129 16:12:45.143387 2663 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 29 16:12:45.233274 kubelet[2663]: W0129 16:12:45.233229 2663 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 29 16:12:45.233490 kubelet[2663]: E0129 16:12:45.233465 2663 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-srv-6bdnt.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-6bdnt.gb1.brightbox.com" Jan 29 16:12:45.263918 kubelet[2663]: I0129 16:12:45.263728 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-6bdnt.gb1.brightbox.com" podStartSLOduration=1.263693717 podStartE2EDuration="1.263693717s" podCreationTimestamp="2025-01-29 16:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 16:12:45.262740532 +0000 UTC m=+1.282058344" watchObservedRunningTime="2025-01-29 16:12:45.263693717 +0000 UTC m=+1.283011521" Jan 29 16:12:45.315303 kubelet[2663]: I0129 16:12:45.315108 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-6bdnt.gb1.brightbox.com" podStartSLOduration=1.315084927 podStartE2EDuration="1.315084927s" podCreationTimestamp="2025-01-29 16:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 16:12:45.299439854 +0000 UTC m=+1.318757659" watchObservedRunningTime="2025-01-29 16:12:45.315084927 +0000 UTC m=+1.334402732" Jan 29 16:12:45.341523 kubelet[2663]: I0129 16:12:45.341101 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-6bdnt.gb1.brightbox.com" podStartSLOduration=1.341079503 podStartE2EDuration="1.341079503s" podCreationTimestamp="2025-01-29 16:12:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 16:12:45.316775005 +0000 UTC m=+1.336092827" watchObservedRunningTime="2025-01-29 16:12:45.341079503 +0000 UTC m=+1.360397295" Jan 29 16:12:49.277747 kubelet[2663]: I0129 16:12:49.275046 2663 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 29 16:12:49.277747 kubelet[2663]: I0129 16:12:49.278284 2663 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 29 16:12:49.284969 containerd[1493]: time="2025-01-29T16:12:49.277906569Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Jan 29 16:12:49.781748 kubelet[2663]: I0129 16:12:49.781351 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f2925e7c-ecec-4b29-a4dd-0c21753970c1-kube-proxy\") pod \"kube-proxy-xs5zw\" (UID: \"f2925e7c-ecec-4b29-a4dd-0c21753970c1\") " pod="kube-system/kube-proxy-xs5zw" Jan 29 16:12:49.781748 kubelet[2663]: I0129 16:12:49.781413 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f2925e7c-ecec-4b29-a4dd-0c21753970c1-xtables-lock\") pod \"kube-proxy-xs5zw\" (UID: \"f2925e7c-ecec-4b29-a4dd-0c21753970c1\") " pod="kube-system/kube-proxy-xs5zw" Jan 29 16:12:49.785256 kubelet[2663]: I0129 16:12:49.781453 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq9rx\" (UniqueName: \"kubernetes.io/projected/f2925e7c-ecec-4b29-a4dd-0c21753970c1-kube-api-access-tq9rx\") pod \"kube-proxy-xs5zw\" (UID: \"f2925e7c-ecec-4b29-a4dd-0c21753970c1\") " pod="kube-system/kube-proxy-xs5zw" Jan 29 16:12:49.785256 kubelet[2663]: I0129 16:12:49.782448 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2925e7c-ecec-4b29-a4dd-0c21753970c1-lib-modules\") pod \"kube-proxy-xs5zw\" (UID: \"f2925e7c-ecec-4b29-a4dd-0c21753970c1\") " pod="kube-system/kube-proxy-xs5zw" Jan 29 16:12:49.794667 systemd[1]: Created slice kubepods-besteffort-podf2925e7c_ecec_4b29_a4dd_0c21753970c1.slice - libcontainer container kubepods-besteffort-podf2925e7c_ecec_4b29_a4dd_0c21753970c1.slice. Jan 29 16:12:49.894926 kubelet[2663]: E0129 16:12:49.894449 2663 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Jan 29 16:12:49.894926 kubelet[2663]: E0129 16:12:49.894518 2663 projected.go:194] Error preparing data for projected volume kube-api-access-tq9rx for pod kube-system/kube-proxy-xs5zw: configmap "kube-root-ca.crt" not found Jan 29 16:12:49.894926 kubelet[2663]: E0129 16:12:49.894671 2663 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f2925e7c-ecec-4b29-a4dd-0c21753970c1-kube-api-access-tq9rx podName:f2925e7c-ecec-4b29-a4dd-0c21753970c1 nodeName:}" failed. No retries permitted until 2025-01-29 16:12:50.394621254 +0000 UTC m=+6.413939052 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tq9rx" (UniqueName: "kubernetes.io/projected/f2925e7c-ecec-4b29-a4dd-0c21753970c1-kube-api-access-tq9rx") pod "kube-proxy-xs5zw" (UID: "f2925e7c-ecec-4b29-a4dd-0c21753970c1") : configmap "kube-root-ca.crt" not found Jan 29 16:12:50.382007 systemd[1]: Created slice kubepods-besteffort-pod1f86af95_cdb7_4243_8c19_b5d7798f937c.slice - libcontainer container kubepods-besteffort-pod1f86af95_cdb7_4243_8c19_b5d7798f937c.slice. 
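The MountVolume.SetUp failure above is not fatal: the operation goes into exponential backoff, and the "No retries permitted until ... (durationBeforeRetry 500ms)" timestamp is exactly the failure time plus the first 500ms delay. A sketch of that arithmetic follows; the first delay and failure time are taken from the log, while the doubling factor and absence of a cap are illustrative assumptions, not values read out of the kubelet:

    package main

    import (
        "fmt"
        "time"
    )

    // Sketch: exponential backoff on a failed volume mount operation.
    func main() {
        failedAt, _ := time.Parse(time.RFC3339Nano, "2025-01-29T16:12:49.894621254Z")
        backoff := 500 * time.Millisecond // durationBeforeRetry from the log
        for i := 1; i <= 4; i++ {
            fmt.Printf("retry %d no earlier than %s (backoff %s)\n",
                i, failedAt.Add(backoff).Format(time.RFC3339Nano), backoff)
            backoff *= 2 // assumed doubling from the original failure time
        }
    }

The first printed deadline, 2025-01-29T16:12:50.394621254Z, matches the retry time logged above; once kube-proxy creates the kube-root-ca.crt configmap dependency, a later retry succeeds.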
Jan 29 16:12:50.386330 kubelet[2663]: I0129 16:12:50.386205 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1f86af95-cdb7-4243-8c19-b5d7798f937c-var-lib-calico\") pod \"tigera-operator-76c4976dd7-bx8fr\" (UID: \"1f86af95-cdb7-4243-8c19-b5d7798f937c\") " pod="tigera-operator/tigera-operator-76c4976dd7-bx8fr" Jan 29 16:12:50.386330 kubelet[2663]: I0129 16:12:50.386268 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm5mj\" (UniqueName: \"kubernetes.io/projected/1f86af95-cdb7-4243-8c19-b5d7798f937c-kube-api-access-hm5mj\") pod \"tigera-operator-76c4976dd7-bx8fr\" (UID: \"1f86af95-cdb7-4243-8c19-b5d7798f937c\") " pod="tigera-operator/tigera-operator-76c4976dd7-bx8fr" Jan 29 16:12:50.671221 sudo[1772]: pam_unix(sudo:session): session closed for user root Jan 29 16:12:50.690449 containerd[1493]: time="2025-01-29T16:12:50.689280271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-76c4976dd7-bx8fr,Uid:1f86af95-cdb7-4243-8c19-b5d7798f937c,Namespace:tigera-operator,Attempt:0,}" Jan 29 16:12:50.708878 containerd[1493]: time="2025-01-29T16:12:50.708835393Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xs5zw,Uid:f2925e7c-ecec-4b29-a4dd-0c21753970c1,Namespace:kube-system,Attempt:0,}" Jan 29 16:12:50.741995 containerd[1493]: time="2025-01-29T16:12:50.741858006Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:12:50.741995 containerd[1493]: time="2025-01-29T16:12:50.741949819Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:12:50.742922 containerd[1493]: time="2025-01-29T16:12:50.742544319Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:12:50.742922 containerd[1493]: time="2025-01-29T16:12:50.742773157Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:12:50.760431 containerd[1493]: time="2025-01-29T16:12:50.760311826Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:12:50.761783 containerd[1493]: time="2025-01-29T16:12:50.761673561Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:12:50.762053 containerd[1493]: time="2025-01-29T16:12:50.761832099Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:12:50.762355 containerd[1493]: time="2025-01-29T16:12:50.762119157Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:12:50.794677 systemd[1]: Started cri-containerd-e41251375f9d76c896a3df5445495c74ae7abc8dc9628df53c3db64eef74ee9e.scope - libcontainer container e41251375f9d76c896a3df5445495c74ae7abc8dc9628df53c3db64eef74ee9e. 
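The systemd unit names in the slice and scope lines above follow a mechanical scheme: pod cgroups become kubepods-<qos>-pod<UID>.slice units with dashes in the UID mapped to underscores, and each sandbox or container runs in a cri-containerd-<id>.scope transient unit. A small sketch that reproduces the two names from this log:

    package main

    import (
        "fmt"
        "strings"
    )

    // Sketch of the cgroup/unit naming visible in the systemd entries.
    func podSlice(qos, uid string) string {
        return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, strings.ReplaceAll(uid, "-", "_"))
    }

    func containerScope(id string) string {
        return "cri-containerd-" + id + ".scope"
    }

    func main() {
        // Both outputs match units created earlier in this log.
        fmt.Println(podSlice("besteffort", "f2925e7c-ecec-4b29-a4dd-0c21753970c1"))
        fmt.Println(containerScope("e41251375f9d76c896a3df5445495c74ae7abc8dc9628df53c3db64eef74ee9e"))
    }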
Jan 29 16:12:50.818677 sshd[1769]: pam_unix(sshd:session): session closed for user core Jan 29 16:12:50.821388 systemd[1]: Started cri-containerd-eadf1d847eeee6aa39cbe3ef35daaeeb4b41b7ecd6a219ccd3331bdfc8c7bda7.scope - libcontainer container eadf1d847eeee6aa39cbe3ef35daaeeb4b41b7ecd6a219ccd3331bdfc8c7bda7. Jan 29 16:12:50.825212 systemd[1]: sshd@8-10.230.37.146:22-139.178.68.195:49490.service: Deactivated successfully. Jan 29 16:12:50.825744 systemd-logind[1482]: Session 11 logged out. Waiting for processes to exit. Jan 29 16:12:50.833877 systemd[1]: session-11.scope: Deactivated successfully. Jan 29 16:12:50.834502 systemd[1]: session-11.scope: Consumed 7.061s CPU time, 140.9M memory peak, 0B memory swap peak. Jan 29 16:12:50.842435 systemd-logind[1482]: Removed session 11. Jan 29 16:12:50.874002 containerd[1493]: time="2025-01-29T16:12:50.873798575Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xs5zw,Uid:f2925e7c-ecec-4b29-a4dd-0c21753970c1,Namespace:kube-system,Attempt:0,} returns sandbox id \"eadf1d847eeee6aa39cbe3ef35daaeeb4b41b7ecd6a219ccd3331bdfc8c7bda7\"" Jan 29 16:12:50.881890 containerd[1493]: time="2025-01-29T16:12:50.881724813Z" level=info msg="CreateContainer within sandbox \"eadf1d847eeee6aa39cbe3ef35daaeeb4b41b7ecd6a219ccd3331bdfc8c7bda7\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 29 16:12:50.901976 containerd[1493]: time="2025-01-29T16:12:50.901900183Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-76c4976dd7-bx8fr,Uid:1f86af95-cdb7-4243-8c19-b5d7798f937c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e41251375f9d76c896a3df5445495c74ae7abc8dc9628df53c3db64eef74ee9e\"" Jan 29 16:12:50.904674 containerd[1493]: time="2025-01-29T16:12:50.904481314Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Jan 29 16:12:50.922860 containerd[1493]: time="2025-01-29T16:12:50.922251599Z" level=info msg="CreateContainer within sandbox \"eadf1d847eeee6aa39cbe3ef35daaeeb4b41b7ecd6a219ccd3331bdfc8c7bda7\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"035d2e54f72dee3c6e7fbbcd0833464f18547e1411f76df87ba0ef5e45b7c2ba\"" Jan 29 16:12:50.926519 containerd[1493]: time="2025-01-29T16:12:50.926007686Z" level=info msg="StartContainer for \"035d2e54f72dee3c6e7fbbcd0833464f18547e1411f76df87ba0ef5e45b7c2ba\"" Jan 29 16:12:50.972421 systemd[1]: Started cri-containerd-035d2e54f72dee3c6e7fbbcd0833464f18547e1411f76df87ba0ef5e45b7c2ba.scope - libcontainer container 035d2e54f72dee3c6e7fbbcd0833464f18547e1411f76df87ba0ef5e45b7c2ba. Jan 29 16:12:51.018971 containerd[1493]: time="2025-01-29T16:12:51.018415300Z" level=info msg="StartContainer for \"035d2e54f72dee3c6e7fbbcd0833464f18547e1411f76df87ba0ef5e45b7c2ba\" returns successfully" Jan 29 16:12:52.853877 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2662982978.mount: Deactivated successfully. 
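Read together, the containerd entries around here trace the standard CRI sequence for kube-proxy: RunPodSandbox returns a sandbox id, CreateContainer is issued within that sandbox, and StartContainer is called on the returned container id. The sketch below is a stand-in interface to make the ordering explicit, not the real CRI client API; the ids are the ones that appear in this log:

    package main

    import "fmt"

    // Sketch only: a fake stand-in for the CRI call sequence.
    type fakeCRI struct{}

    func (fakeCRI) RunPodSandbox(pod string) string {
        return "eadf1d847eeee6aa39cbe3ef35daaeeb4b41b7ecd6a219ccd3331bdfc8c7bda7"
    }

    func (fakeCRI) CreateContainer(sandboxID, name string) string {
        return "035d2e54f72dee3c6e7fbbcd0833464f18547e1411f76df87ba0ef5e45b7c2ba"
    }

    func (fakeCRI) StartContainer(id string) {
        fmt.Printf("StartContainer for %q returns successfully\n", id)
    }

    func main() {
        var cri fakeCRI
        sandbox := cri.RunPodSandbox("kube-proxy-xs5zw")  // RunPodSandbox -> sandbox id
        id := cri.CreateContainer(sandbox, "kube-proxy")  // CreateContainer within that sandbox
        cri.StartContainer(id)                            // StartContainer by container id
    }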
Jan 29 16:12:53.785251 containerd[1493]: time="2025-01-29T16:12:53.785160928Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:53.786700 containerd[1493]: time="2025-01-29T16:12:53.786639213Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21762497" Jan 29 16:12:53.787896 containerd[1493]: time="2025-01-29T16:12:53.787329514Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:53.790616 containerd[1493]: time="2025-01-29T16:12:53.790562185Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:12:53.792223 containerd[1493]: time="2025-01-29T16:12:53.792132832Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 2.88758725s" Jan 29 16:12:53.792368 containerd[1493]: time="2025-01-29T16:12:53.792330864Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Jan 29 16:12:53.796366 containerd[1493]: time="2025-01-29T16:12:53.796318057Z" level=info msg="CreateContainer within sandbox \"e41251375f9d76c896a3df5445495c74ae7abc8dc9628df53c3db64eef74ee9e\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 29 16:12:53.817995 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3369848273.mount: Deactivated successfully. Jan 29 16:12:53.823293 containerd[1493]: time="2025-01-29T16:12:53.821874633Z" level=info msg="CreateContainer within sandbox \"e41251375f9d76c896a3df5445495c74ae7abc8dc9628df53c3db64eef74ee9e\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"b9f9a8fd79890bb3001990cc3c203a28f28bde4a3dc40ea2db52abb4ed30e051\"" Jan 29 16:12:53.825292 containerd[1493]: time="2025-01-29T16:12:53.824896672Z" level=info msg="StartContainer for \"b9f9a8fd79890bb3001990cc3c203a28f28bde4a3dc40ea2db52abb4ed30e051\"" Jan 29 16:12:53.883489 systemd[1]: Started cri-containerd-b9f9a8fd79890bb3001990cc3c203a28f28bde4a3dc40ea2db52abb4ed30e051.scope - libcontainer container b9f9a8fd79890bb3001990cc3c203a28f28bde4a3dc40ea2db52abb4ed30e051. 
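A quick sanity check on the pull above: containerd reports 21762497 bytes read for quay.io/tigera/operator:v1.36.2 in 2.88758725s, i.e. roughly 7.5 MB/s. Both numbers are taken from the entries above:

    package main

    import (
        "fmt"
        "time"
    )

    // Back-of-the-envelope pull throughput from the containerd entries.
    func main() {
        const bytesRead = 21762497.0
        elapsed := 2*time.Second + 887587250*time.Nanosecond // 2.88758725s
        fmt.Printf("%.1f MB/s\n", bytesRead/elapsed.Seconds()/1e6) // ~7.5 MB/s
    }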
Jan 29 16:12:53.923860 containerd[1493]: time="2025-01-29T16:12:53.923774518Z" level=info msg="StartContainer for \"b9f9a8fd79890bb3001990cc3c203a28f28bde4a3dc40ea2db52abb4ed30e051\" returns successfully" Jan 29 16:12:54.181289 kubelet[2663]: I0129 16:12:54.180715 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-xs5zw" podStartSLOduration=5.180661689 podStartE2EDuration="5.180661689s" podCreationTimestamp="2025-01-29 16:12:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 16:12:51.259542406 +0000 UTC m=+7.278860200" watchObservedRunningTime="2025-01-29 16:12:54.180661689 +0000 UTC m=+10.199979504" Jan 29 16:12:54.285927 kubelet[2663]: I0129 16:12:54.285403 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-76c4976dd7-bx8fr" podStartSLOduration=1.3954517069999999 podStartE2EDuration="4.285301855s" podCreationTimestamp="2025-01-29 16:12:50 +0000 UTC" firstStartedPulling="2025-01-29 16:12:50.9037155 +0000 UTC m=+6.923033296" lastFinishedPulling="2025-01-29 16:12:53.793565647 +0000 UTC m=+9.812883444" observedRunningTime="2025-01-29 16:12:54.284359172 +0000 UTC m=+10.303676987" watchObservedRunningTime="2025-01-29 16:12:54.285301855 +0000 UTC m=+10.304619672" Jan 29 16:12:57.333844 kubelet[2663]: I0129 16:12:57.332590 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d150fe-b678-4784-b548-afba79174d20-tigera-ca-bundle\") pod \"calico-typha-55dcc9b747-gf926\" (UID: \"67d150fe-b678-4784-b548-afba79174d20\") " pod="calico-system/calico-typha-55dcc9b747-gf926" Jan 29 16:12:57.333844 kubelet[2663]: I0129 16:12:57.332690 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/67d150fe-b678-4784-b548-afba79174d20-typha-certs\") pod \"calico-typha-55dcc9b747-gf926\" (UID: \"67d150fe-b678-4784-b548-afba79174d20\") " pod="calico-system/calico-typha-55dcc9b747-gf926" Jan 29 16:12:57.333844 kubelet[2663]: I0129 16:12:57.332743 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr2c2\" (UniqueName: \"kubernetes.io/projected/67d150fe-b678-4784-b548-afba79174d20-kube-api-access-pr2c2\") pod \"calico-typha-55dcc9b747-gf926\" (UID: \"67d150fe-b678-4784-b548-afba79174d20\") " pod="calico-system/calico-typha-55dcc9b747-gf926" Jan 29 16:12:57.351970 systemd[1]: Created slice kubepods-besteffort-pod67d150fe_b678_4784_b548_afba79174d20.slice - libcontainer container kubepods-besteffort-pod67d150fe_b678_4784_b548_afba79174d20.slice. Jan 29 16:12:57.493773 systemd[1]: Created slice kubepods-besteffort-podac778ba4_a7a1_4547_bfec_fdc4b3b030c3.slice - libcontainer container kubepods-besteffort-podac778ba4_a7a1_4547_bfec_fdc4b3b030c3.slice. 
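Unlike the static pods earlier, the tigera-operator pod did pull an image, and its tracker entry above shows how that changes the arithmetic: podStartSLOduration is the end-to-end startup time minus the pull window (lastFinishedPulling minus firstStartedPulling). Recomputing from the entry's own timestamps:

    package main

    import (
        "fmt"
        "time"
    )

    // Sketch: pull-adjusted case of the pod startup latency tracker arithmetic.
    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, _ := time.Parse(layout, "2025-01-29 16:12:50 +0000 UTC")            // podCreationTimestamp
        watched, _ := time.Parse(layout, "2025-01-29 16:12:54.285301855 +0000 UTC")  // watchObservedRunningTime
        pullStart, _ := time.Parse(layout, "2025-01-29 16:12:50.9037155 +0000 UTC")  // firstStartedPulling
        pullEnd, _ := time.Parse(layout, "2025-01-29 16:12:53.793565647 +0000 UTC")  // lastFinishedPulling

        e2e := watched.Sub(created)         // 4.285301855s, the logged podStartE2EDuration
        slo := e2e - pullEnd.Sub(pullStart) // 1.395451708s, the logged podStartSLOduration (modulo float formatting)
        fmt.Println(e2e, slo)
    }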
Jan 29 16:12:57.496864 kubelet[2663]: W0129 16:12:57.496525 2663 reflector.go:561] object-"calico-system"/"cni-config": failed to list *v1.ConfigMap: configmaps "cni-config" is forbidden: User "system:node:srv-6bdnt.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'srv-6bdnt.gb1.brightbox.com' and this object Jan 29 16:12:57.497205 kubelet[2663]: E0129 16:12:57.497146 2663 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"cni-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-config\" is forbidden: User \"system:node:srv-6bdnt.gb1.brightbox.com\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'srv-6bdnt.gb1.brightbox.com' and this object" logger="UnhandledError" Jan 29 16:12:57.497787 kubelet[2663]: W0129 16:12:57.497345 2663 reflector.go:561] object-"calico-system"/"node-certs": failed to list *v1.Secret: secrets "node-certs" is forbidden: User "system:node:srv-6bdnt.gb1.brightbox.com" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'srv-6bdnt.gb1.brightbox.com' and this object Jan 29 16:12:57.497787 kubelet[2663]: E0129 16:12:57.497404 2663 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"node-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-certs\" is forbidden: User \"system:node:srv-6bdnt.gb1.brightbox.com\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'srv-6bdnt.gb1.brightbox.com' and this object" logger="UnhandledError" Jan 29 16:12:57.630622 kubelet[2663]: E0129 16:12:57.630378 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gd895" podUID="e9d2429c-47e2-48a0-bc35-409f18438229" Jan 29 16:12:57.641267 kubelet[2663]: I0129 16:12:57.640536 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-xtables-lock\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.641267 kubelet[2663]: I0129 16:12:57.640596 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd9ph\" (UniqueName: \"kubernetes.io/projected/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-kube-api-access-bd9ph\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.641267 kubelet[2663]: I0129 16:12:57.640628 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-tigera-ca-bundle\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.641267 kubelet[2663]: I0129 16:12:57.640653 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: 
\"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-var-run-calico\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.641267 kubelet[2663]: I0129 16:12:57.640733 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-var-lib-calico\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.641770 kubelet[2663]: I0129 16:12:57.640763 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-net-dir\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.641770 kubelet[2663]: I0129 16:12:57.640791 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-policysync\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.641770 kubelet[2663]: I0129 16:12:57.640818 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-bin-dir\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.641770 kubelet[2663]: I0129 16:12:57.640844 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-lib-modules\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.641770 kubelet[2663]: I0129 16:12:57.640870 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-node-certs\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.642038 kubelet[2663]: I0129 16:12:57.640895 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-log-dir\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.642038 kubelet[2663]: I0129 16:12:57.640923 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-flexvol-driver-host\") pod \"calico-node-p7vmg\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") " pod="calico-system/calico-node-p7vmg" Jan 29 16:12:57.662211 containerd[1493]: time="2025-01-29T16:12:57.661926313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-55dcc9b747-gf926,Uid:67d150fe-b678-4784-b548-afba79174d20,Namespace:calico-system,Attempt:0,}" Jan 29 16:12:57.744517 kubelet[2663]: I0129 
16:12:57.742226 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e9d2429c-47e2-48a0-bc35-409f18438229-registration-dir\") pod \"csi-node-driver-gd895\" (UID: \"e9d2429c-47e2-48a0-bc35-409f18438229\") " pod="calico-system/csi-node-driver-gd895" Jan 29 16:12:57.745702 kubelet[2663]: I0129 16:12:57.744990 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e9d2429c-47e2-48a0-bc35-409f18438229-varrun\") pod \"csi-node-driver-gd895\" (UID: \"e9d2429c-47e2-48a0-bc35-409f18438229\") " pod="calico-system/csi-node-driver-gd895" Jan 29 16:12:57.745702 kubelet[2663]: I0129 16:12:57.745055 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e9d2429c-47e2-48a0-bc35-409f18438229-socket-dir\") pod \"csi-node-driver-gd895\" (UID: \"e9d2429c-47e2-48a0-bc35-409f18438229\") " pod="calico-system/csi-node-driver-gd895" Jan 29 16:12:57.745702 kubelet[2663]: I0129 16:12:57.745490 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e9d2429c-47e2-48a0-bc35-409f18438229-kubelet-dir\") pod \"csi-node-driver-gd895\" (UID: \"e9d2429c-47e2-48a0-bc35-409f18438229\") " pod="calico-system/csi-node-driver-gd895" Jan 29 16:12:57.745702 kubelet[2663]: I0129 16:12:57.745528 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-475bb\" (UniqueName: \"kubernetes.io/projected/e9d2429c-47e2-48a0-bc35-409f18438229-kube-api-access-475bb\") pod \"csi-node-driver-gd895\" (UID: \"e9d2429c-47e2-48a0-bc35-409f18438229\") " pod="calico-system/csi-node-driver-gd895" Jan 29 16:12:57.751636 kubelet[2663]: E0129 16:12:57.751021 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.751636 kubelet[2663]: W0129 16:12:57.751056 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.751636 kubelet[2663]: E0129 16:12:57.751524 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.753136 kubelet[2663]: E0129 16:12:57.752987 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.753136 kubelet[2663]: W0129 16:12:57.753006 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.753136 kubelet[2663]: E0129 16:12:57.753058 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 16:12:57.754029 kubelet[2663]: E0129 16:12:57.753826 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.755029 kubelet[2663]: W0129 16:12:57.754143 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.755029 kubelet[2663]: E0129 16:12:57.754210 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.757201 kubelet[2663]: E0129 16:12:57.755544 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.757201 kubelet[2663]: W0129 16:12:57.755573 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.757201 kubelet[2663]: E0129 16:12:57.755591 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.757950 kubelet[2663]: E0129 16:12:57.757648 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.757950 kubelet[2663]: W0129 16:12:57.757676 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.757950 kubelet[2663]: E0129 16:12:57.757705 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.758415 kubelet[2663]: E0129 16:12:57.758249 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.758415 kubelet[2663]: W0129 16:12:57.758277 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.758415 kubelet[2663]: E0129 16:12:57.758294 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.759218 kubelet[2663]: E0129 16:12:57.758954 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.759218 kubelet[2663]: W0129 16:12:57.758974 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.759218 kubelet[2663]: E0129 16:12:57.759001 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 16:12:57.763699 kubelet[2663]: E0129 16:12:57.763400 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.763699 kubelet[2663]: W0129 16:12:57.763418 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.763699 kubelet[2663]: E0129 16:12:57.763759 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.763699 kubelet[2663]: W0129 16:12:57.763773 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.764596 kubelet[2663]: E0129 16:12:57.764388 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.764596 kubelet[2663]: E0129 16:12:57.764451 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.765055 kubelet[2663]: E0129 16:12:57.764844 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.765055 kubelet[2663]: W0129 16:12:57.764897 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.765055 kubelet[2663]: E0129 16:12:57.764974 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.766088 kubelet[2663]: E0129 16:12:57.765755 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.766088 kubelet[2663]: W0129 16:12:57.765810 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.766088 kubelet[2663]: E0129 16:12:57.765929 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.768866 kubelet[2663]: E0129 16:12:57.768562 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.768866 kubelet[2663]: W0129 16:12:57.768593 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.768866 kubelet[2663]: E0129 16:12:57.768741 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 16:12:57.769510 kubelet[2663]: E0129 16:12:57.769298 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.769510 kubelet[2663]: W0129 16:12:57.769317 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.769510 kubelet[2663]: E0129 16:12:57.769365 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.770130 containerd[1493]: time="2025-01-29T16:12:57.769608825Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:12:57.770130 containerd[1493]: time="2025-01-29T16:12:57.769800492Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:12:57.770130 containerd[1493]: time="2025-01-29T16:12:57.769827173Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:12:57.770342 kubelet[2663]: E0129 16:12:57.769941 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.770342 kubelet[2663]: W0129 16:12:57.769955 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.772415 kubelet[2663]: E0129 16:12:57.770879 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.772415 kubelet[2663]: E0129 16:12:57.771516 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.772415 kubelet[2663]: W0129 16:12:57.771531 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.772415 kubelet[2663]: E0129 16:12:57.772037 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:12:57.772415 kubelet[2663]: E0129 16:12:57.772259 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:12:57.772415 kubelet[2663]: W0129 16:12:57.772273 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:12:57.772415 kubelet[2663]: E0129 16:12:57.772299 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Jan 29 16:12:57.774733 kubelet[2663]: E0129 16:12:57.774710 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 29 16:12:57.776240 kubelet[2663]: W0129 16:12:57.775782 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 29 16:12:57.776240 kubelet[2663]: E0129 16:12:57.775818 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 29 16:12:57.778251 containerd[1493]: time="2025-01-29T16:12:57.777556676Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 29 16:12:57.849640 systemd[1]: Started cri-containerd-9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794.scope - libcontainer container 9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794.
Jan 29 16:12:58.005198 containerd[1493]: time="2025-01-29T16:12:58.005068381Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-55dcc9b747-gf926,Uid:67d150fe-b678-4784-b548-afba79174d20,Namespace:calico-system,Attempt:0,} returns sandbox id \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\""
Jan 29 16:12:58.009093 containerd[1493]: time="2025-01-29T16:12:58.009050694Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\""
Jan 29 16:12:58.442347 kubelet[2663]: E0129 16:12:58.442228 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 29 16:12:58.442347 kubelet[2663]: W0129 16:12:58.442263 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 29 16:12:58.442347 kubelet[2663]: E0129 16:12:58.442288 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 29 16:12:58.703038 containerd[1493]: time="2025-01-29T16:12:58.702357944Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-p7vmg,Uid:ac778ba4-a7a1-4547-bfec-fdc4b3b030c3,Namespace:calico-system,Attempt:0,}"
Jan 29 16:12:58.738008 containerd[1493]: time="2025-01-29T16:12:58.737837899Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 29 16:12:58.738913 containerd[1493]: time="2025-01-29T16:12:58.738824965Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 29 16:12:58.738913 containerd[1493]: time="2025-01-29T16:12:58.738856525Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 29 16:12:58.739205 containerd[1493]: time="2025-01-29T16:12:58.738989574Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 29 16:12:58.781427 systemd[1]: Started cri-containerd-4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409.scope - libcontainer container 4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409.
Jan 29 16:12:58.822491 containerd[1493]: time="2025-01-29T16:12:58.822418289Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-p7vmg,Uid:ac778ba4-a7a1-4547-bfec-fdc4b3b030c3,Namespace:calico-system,Attempt:0,} returns sandbox id \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\""
Jan 29 16:12:59.171827 kubelet[2663]: E0129 16:12:59.171619 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gd895" podUID="e9d2429c-47e2-48a0-bc35-409f18438229"
Jan 29 16:12:59.543042 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount63581351.mount: Deactivated successfully.
Jan 29 16:13:01.163690 containerd[1493]: time="2025-01-29T16:13:01.162647878Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 29 16:13:01.166420 containerd[1493]: time="2025-01-29T16:13:01.166141384Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363"
Jan 29 16:13:01.167725 containerd[1493]: time="2025-01-29T16:13:01.167315621Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 29 16:13:01.176122 kubelet[2663]: E0129 16:13:01.175891 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gd895" podUID="e9d2429c-47e2-48a0-bc35-409f18438229"
Jan 29 16:13:01.181364 containerd[1493]: time="2025-01-29T16:13:01.181196847Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 29 16:13:01.184868 containerd[1493]: time="2025-01-29T16:13:01.184820273Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 3.175576417s"
Jan 29 16:13:01.185925 containerd[1493]: time="2025-01-29T16:13:01.185891806Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\""
Jan 29 16:13:01.191076 containerd[1493]: time="2025-01-29T16:13:01.191041316Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\""
Jan 29 16:13:01.226600 containerd[1493]: time="2025-01-29T16:13:01.226244681Z" level=info msg="CreateContainer within sandbox \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Jan 29 16:13:01.252143 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4157381727.mount: Deactivated successfully.
Jan 29 16:13:01.255747 containerd[1493]: time="2025-01-29T16:13:01.255701678Z" level=info msg="CreateContainer within sandbox \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\""
Jan 29 16:13:01.258215 containerd[1493]: time="2025-01-29T16:13:01.257100041Z" level=info msg="StartContainer for \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\""
Jan 29 16:13:01.324445 systemd[1]: Started cri-containerd-ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400.scope - libcontainer container ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400.
Jan 29 16:13:01.430209 containerd[1493]: time="2025-01-29T16:13:01.430027270Z" level=info msg="StartContainer for \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\" returns successfully" Jan 29 16:13:02.334419 kubelet[2663]: I0129 16:13:02.334263 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-55dcc9b747-gf926" podStartSLOduration=2.154185317 podStartE2EDuration="5.334129495s" podCreationTimestamp="2025-01-29 16:12:57 +0000 UTC" firstStartedPulling="2025-01-29 16:12:58.007860711 +0000 UTC m=+14.027178508" lastFinishedPulling="2025-01-29 16:13:01.187804892 +0000 UTC m=+17.207122686" observedRunningTime="2025-01-29 16:13:02.330582719 +0000 UTC m=+18.349900519" watchObservedRunningTime="2025-01-29 16:13:02.334129495 +0000 UTC m=+18.353447300" Jan 29 16:13:02.388289 kubelet[2663]: E0129 16:13:02.388247 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.388289 kubelet[2663]: W0129 16:13:02.388286 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.388507 kubelet[2663]: E0129 16:13:02.388335 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.388663 kubelet[2663]: E0129 16:13:02.388633 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.388663 kubelet[2663]: W0129 16:13:02.388654 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.388792 kubelet[2663]: E0129 16:13:02.388670 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.388971 kubelet[2663]: E0129 16:13:02.388948 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.388971 kubelet[2663]: W0129 16:13:02.388968 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.389087 kubelet[2663]: E0129 16:13:02.388983 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.389289 kubelet[2663]: E0129 16:13:02.389266 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.389289 kubelet[2663]: W0129 16:13:02.389286 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.389434 kubelet[2663]: E0129 16:13:02.389301 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 16:13:02.389606 kubelet[2663]: E0129 16:13:02.389584 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.389606 kubelet[2663]: W0129 16:13:02.389605 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.389730 kubelet[2663]: E0129 16:13:02.389620 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.389946 kubelet[2663]: E0129 16:13:02.389921 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.389946 kubelet[2663]: W0129 16:13:02.389941 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.390066 kubelet[2663]: E0129 16:13:02.389957 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.390296 kubelet[2663]: E0129 16:13:02.390258 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.390296 kubelet[2663]: W0129 16:13:02.390278 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.390296 kubelet[2663]: E0129 16:13:02.390293 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.390621 kubelet[2663]: E0129 16:13:02.390568 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.390621 kubelet[2663]: W0129 16:13:02.390588 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.390621 kubelet[2663]: E0129 16:13:02.390603 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.390882 kubelet[2663]: E0129 16:13:02.390863 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.390882 kubelet[2663]: W0129 16:13:02.390882 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.391030 kubelet[2663]: E0129 16:13:02.390897 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 16:13:02.391185 kubelet[2663]: E0129 16:13:02.391153 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.391278 kubelet[2663]: W0129 16:13:02.391195 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.391278 kubelet[2663]: E0129 16:13:02.391213 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.391495 kubelet[2663]: E0129 16:13:02.391476 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.391495 kubelet[2663]: W0129 16:13:02.391494 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.391608 kubelet[2663]: E0129 16:13:02.391510 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.391772 kubelet[2663]: E0129 16:13:02.391752 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.391772 kubelet[2663]: W0129 16:13:02.391772 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.391894 kubelet[2663]: E0129 16:13:02.391787 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.392077 kubelet[2663]: E0129 16:13:02.392056 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.392143 kubelet[2663]: W0129 16:13:02.392076 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.392143 kubelet[2663]: E0129 16:13:02.392092 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.392388 kubelet[2663]: E0129 16:13:02.392367 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.392388 kubelet[2663]: W0129 16:13:02.392386 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.392510 kubelet[2663]: E0129 16:13:02.392401 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 16:13:02.392668 kubelet[2663]: E0129 16:13:02.392647 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.392668 kubelet[2663]: W0129 16:13:02.392666 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.392780 kubelet[2663]: E0129 16:13:02.392681 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.403029 kubelet[2663]: E0129 16:13:02.403003 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.403329 kubelet[2663]: W0129 16:13:02.403154 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.403329 kubelet[2663]: E0129 16:13:02.403204 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.404202 kubelet[2663]: E0129 16:13:02.403935 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.404202 kubelet[2663]: W0129 16:13:02.403954 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.404202 kubelet[2663]: E0129 16:13:02.403980 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.404665 kubelet[2663]: E0129 16:13:02.404249 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.404665 kubelet[2663]: W0129 16:13:02.404266 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.404665 kubelet[2663]: E0129 16:13:02.404290 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.405077 kubelet[2663]: E0129 16:13:02.404735 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.405077 kubelet[2663]: W0129 16:13:02.404749 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.405077 kubelet[2663]: E0129 16:13:02.404783 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 16:13:02.405077 kubelet[2663]: E0129 16:13:02.405074 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.405330 kubelet[2663]: W0129 16:13:02.405088 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.405330 kubelet[2663]: E0129 16:13:02.405104 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.407378 kubelet[2663]: E0129 16:13:02.407283 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.407378 kubelet[2663]: W0129 16:13:02.407305 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.407521 kubelet[2663]: E0129 16:13:02.407435 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.408466 kubelet[2663]: E0129 16:13:02.408443 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.408745 kubelet[2663]: W0129 16:13:02.408588 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.408930 kubelet[2663]: E0129 16:13:02.408825 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.409181 kubelet[2663]: E0129 16:13:02.409072 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.409181 kubelet[2663]: W0129 16:13:02.409092 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.409181 kubelet[2663]: E0129 16:13:02.409140 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.409713 kubelet[2663]: E0129 16:13:02.409588 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.409713 kubelet[2663]: W0129 16:13:02.409605 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.409713 kubelet[2663]: E0129 16:13:02.409632 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 16:13:02.409908 kubelet[2663]: E0129 16:13:02.409886 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.409989 kubelet[2663]: W0129 16:13:02.409907 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.409989 kubelet[2663]: E0129 16:13:02.409941 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.410274 kubelet[2663]: E0129 16:13:02.410231 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.410274 kubelet[2663]: W0129 16:13:02.410269 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.410482 kubelet[2663]: E0129 16:13:02.410292 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.410545 kubelet[2663]: E0129 16:13:02.410528 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.410545 kubelet[2663]: W0129 16:13:02.410542 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.410723 kubelet[2663]: E0129 16:13:02.410575 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.410810 kubelet[2663]: E0129 16:13:02.410790 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.410810 kubelet[2663]: W0129 16:13:02.410809 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.411038 kubelet[2663]: E0129 16:13:02.410830 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.411116 kubelet[2663]: E0129 16:13:02.411100 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.411116 kubelet[2663]: W0129 16:13:02.411114 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.411304 kubelet[2663]: E0129 16:13:02.411136 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 16:13:02.411488 kubelet[2663]: E0129 16:13:02.411468 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.411488 kubelet[2663]: W0129 16:13:02.411488 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.411616 kubelet[2663]: E0129 16:13:02.411511 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.412243 kubelet[2663]: E0129 16:13:02.412010 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.412243 kubelet[2663]: W0129 16:13:02.412029 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.412243 kubelet[2663]: E0129 16:13:02.412057 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.412825 kubelet[2663]: E0129 16:13:02.412310 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.412825 kubelet[2663]: W0129 16:13:02.412325 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.412825 kubelet[2663]: E0129 16:13:02.412353 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 29 16:13:02.413286 kubelet[2663]: E0129 16:13:02.413207 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:02.413286 kubelet[2663]: W0129 16:13:02.413226 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:02.413286 kubelet[2663]: E0129 16:13:02.413243 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 29 16:13:03.169838 kubelet[2663]: E0129 16:13:03.169748 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gd895" podUID="e9d2429c-47e2-48a0-bc35-409f18438229" Jan 29 16:13:03.304589 containerd[1493]: time="2025-01-29T16:13:03.304468884Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:03.313425 containerd[1493]: time="2025-01-29T16:13:03.313363347Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121" Jan 29 16:13:03.314735 kubelet[2663]: I0129 16:13:03.314110 2663 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 16:13:03.318328 containerd[1493]: time="2025-01-29T16:13:03.318293357Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:03.324282 containerd[1493]: time="2025-01-29T16:13:03.324243766Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:03.326524 containerd[1493]: time="2025-01-29T16:13:03.326486341Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 2.135226391s" Jan 29 16:13:03.326812 containerd[1493]: time="2025-01-29T16:13:03.326649885Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Jan 29 16:13:03.332683 containerd[1493]: time="2025-01-29T16:13:03.332445718Z" level=info msg="CreateContainer within sandbox \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 29 16:13:03.383262 containerd[1493]: time="2025-01-29T16:13:03.383198559Z" level=info msg="CreateContainer within sandbox \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030\"" Jan 29 16:13:03.387727 containerd[1493]: time="2025-01-29T16:13:03.384073292Z" level=info msg="StartContainer for \"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030\"" Jan 29 16:13:03.400198 kubelet[2663]: E0129 16:13:03.400134 2663 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 29 16:13:03.400809 kubelet[2663]: W0129 16:13:03.400747 2663 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 29 16:13:03.400885 
kubelet[2663]: E0129 16:13:03.400810 2663 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" [the identical driver-call.go:262 / driver-call.go:149 / plugins.go:691 probe-failure triplet for the missing nodeagent~uds driver repeats thirty-two more times between 16:13:03.403 and 16:13:03.433] Jan 29 16:13:03.496404 systemd[1]: Started cri-containerd-50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030.scope - libcontainer container 50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030. Jan 29 16:13:03.547101 containerd[1493]: time="2025-01-29T16:13:03.547051620Z" level=info msg="StartContainer for \"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030\" returns successfully" Jan 29 16:13:03.596097 systemd[1]: cri-containerd-50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030.scope: Deactivated successfully. Jan 29 16:13:03.640154 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030-rootfs.mount: Deactivated successfully. 
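The probe-failure flood above is mechanical: the kubelet's dynamic FlexVolume prober execs every binary under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ with the single argument `init` and unmarshals its stdout as JSON; because the nodeagent~uds/uds executable is absent, stdout is empty and the unmarshal fails with "unexpected end of JSON input" on every rescan. Below is a minimal sketch, assuming the documented FlexVolume status format, of the handshake a well-formed driver would print; it is illustrative only, not the real nodeagent~uds binary.

```go
// flexvol_init_sketch.go -- illustrative sketch, NOT the real nodeagent~uds
// binary (which is simply absent on this node). kubelet execs "<driver> init"
// and JSON-unmarshals stdout; an empty stdout from a missing binary is exactly
// the "unexpected end of JSON input" logged above.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the documented FlexVolume call-result shape.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false}, // no attach/detach support
		})
		fmt.Println(string(out)) // the JSON the kubelet expects on stdout
		return
	}
	out, _ := json.Marshal(driverStatus{Status: "Not supported"})
	fmt.Println(string(out))
	os.Exit(1)
}
```

Until some executable at the logged path answers `init` this way, the prober re-fails on every plugin rescan, which is all the repetition above amounts to.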
Jan 29 16:13:03.797696 containerd[1493]: time="2025-01-29T16:13:03.786358228Z" level=info msg="shim disconnected" id=50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030 namespace=k8s.io Jan 29 16:13:03.797696 containerd[1493]: time="2025-01-29T16:13:03.797681695Z" level=warning msg="cleaning up after shim disconnected" id=50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030 namespace=k8s.io Jan 29 16:13:03.798238 containerd[1493]: time="2025-01-29T16:13:03.797740958Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 29 16:13:04.320464 containerd[1493]: time="2025-01-29T16:13:04.319676544Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Jan 29 16:13:05.169670 kubelet[2663]: E0129 16:13:05.169572 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gd895" podUID="e9d2429c-47e2-48a0-bc35-409f18438229" Jan 29 16:13:07.169868 kubelet[2663]: E0129 16:13:07.169728 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gd895" podUID="e9d2429c-47e2-48a0-bc35-409f18438229" Jan 29 16:13:09.170370 kubelet[2663]: E0129 16:13:09.169948 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gd895" podUID="e9d2429c-47e2-48a0-bc35-409f18438229" Jan 29 16:13:10.747511 containerd[1493]: time="2025-01-29T16:13:10.747395463Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:10.749027 containerd[1493]: time="2025-01-29T16:13:10.748956907Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Jan 29 16:13:10.750321 containerd[1493]: time="2025-01-29T16:13:10.749623631Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:10.753060 containerd[1493]: time="2025-01-29T16:13:10.753008484Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:10.754242 containerd[1493]: time="2025-01-29T16:13:10.754206936Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 6.434437652s" Jan 29 16:13:10.754409 containerd[1493]: time="2025-01-29T16:13:10.754378963Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Jan 29 16:13:10.759420 containerd[1493]: time="2025-01-29T16:13:10.759378931Z" 
level=info msg="CreateContainer within sandbox \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 29 16:13:10.784666 containerd[1493]: time="2025-01-29T16:13:10.784309689Z" level=info msg="CreateContainer within sandbox \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7\"" Jan 29 16:13:10.789756 containerd[1493]: time="2025-01-29T16:13:10.786343171Z" level=info msg="StartContainer for \"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7\"" Jan 29 16:13:10.791203 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1139298329.mount: Deactivated successfully. Jan 29 16:13:10.884822 systemd[1]: Started cri-containerd-33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7.scope - libcontainer container 33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7. Jan 29 16:13:10.938213 containerd[1493]: time="2025-01-29T16:13:10.937974425Z" level=info msg="StartContainer for \"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7\" returns successfully" Jan 29 16:13:11.169767 kubelet[2663]: E0129 16:13:11.169500 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gd895" podUID="e9d2429c-47e2-48a0-bc35-409f18438229" Jan 29 16:13:11.949828 systemd[1]: cri-containerd-33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7.scope: Deactivated successfully. Jan 29 16:13:12.006124 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7-rootfs.mount: Deactivated successfully. 
Jan 29 16:13:12.068427 kubelet[2663]: I0129 16:13:12.067463 2663 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Jan 29 16:13:12.123521 containerd[1493]: time="2025-01-29T16:13:12.123003575Z" level=info msg="shim disconnected" id=33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7 namespace=k8s.io Jan 29 16:13:12.123521 containerd[1493]: time="2025-01-29T16:13:12.123522101Z" level=warning msg="cleaning up after shim disconnected" id=33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7 namespace=k8s.io Jan 29 16:13:12.124687 containerd[1493]: time="2025-01-29T16:13:12.123566103Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 29 16:13:12.188088 kubelet[2663]: I0129 16:13:12.187548 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9pmg\" (UniqueName: \"kubernetes.io/projected/d480db07-6802-4f6a-b925-8f4c938e9f7b-kube-api-access-j9pmg\") pod \"calico-kube-controllers-85cfdd4458-rcthl\" (UID: \"d480db07-6802-4f6a-b925-8f4c938e9f7b\") " pod="calico-system/calico-kube-controllers-85cfdd4458-rcthl" Jan 29 16:13:12.188088 kubelet[2663]: I0129 16:13:12.187601 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/424af8c2-52c0-4e44-8235-7c2d50b2f3f1-config-volume\") pod \"coredns-6f6b679f8f-9s2kl\" (UID: \"424af8c2-52c0-4e44-8235-7c2d50b2f3f1\") " pod="kube-system/coredns-6f6b679f8f-9s2kl" Jan 29 16:13:12.188088 kubelet[2663]: I0129 16:13:12.187636 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d480db07-6802-4f6a-b925-8f4c938e9f7b-tigera-ca-bundle\") pod \"calico-kube-controllers-85cfdd4458-rcthl\" (UID: \"d480db07-6802-4f6a-b925-8f4c938e9f7b\") " pod="calico-system/calico-kube-controllers-85cfdd4458-rcthl" Jan 29 16:13:12.188088 kubelet[2663]: I0129 16:13:12.187673 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjxhf\" (UniqueName: \"kubernetes.io/projected/8ae120f8-d2d4-415d-b245-151e25c130c4-kube-api-access-tjxhf\") pod \"calico-apiserver-c7c6f64f5-hshkw\" (UID: \"8ae120f8-d2d4-415d-b245-151e25c130c4\") " pod="calico-apiserver/calico-apiserver-c7c6f64f5-hshkw" Jan 29 16:13:12.188088 kubelet[2663]: I0129 16:13:12.187700 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swlnc\" (UniqueName: \"kubernetes.io/projected/7c206eb9-9889-4da7-833e-4f8997709e6e-kube-api-access-swlnc\") pod \"calico-apiserver-c7c6f64f5-jnchp\" (UID: \"7c206eb9-9889-4da7-833e-4f8997709e6e\") " pod="calico-apiserver/calico-apiserver-c7c6f64f5-jnchp" Jan 29 16:13:12.190393 kubelet[2663]: I0129 16:13:12.187726 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glqx8\" (UniqueName: \"kubernetes.io/projected/424af8c2-52c0-4e44-8235-7c2d50b2f3f1-kube-api-access-glqx8\") pod \"coredns-6f6b679f8f-9s2kl\" (UID: \"424af8c2-52c0-4e44-8235-7c2d50b2f3f1\") " pod="kube-system/coredns-6f6b679f8f-9s2kl" Jan 29 16:13:12.190393 kubelet[2663]: I0129 16:13:12.187751 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2hzk\" (UniqueName: \"kubernetes.io/projected/ba3c4db0-9089-4994-8c25-05651ef8025d-kube-api-access-n2hzk\") pod 
\"coredns-6f6b679f8f-hvt88\" (UID: \"ba3c4db0-9089-4994-8c25-05651ef8025d\") " pod="kube-system/coredns-6f6b679f8f-hvt88" Jan 29 16:13:12.190393 kubelet[2663]: I0129 16:13:12.187777 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7c206eb9-9889-4da7-833e-4f8997709e6e-calico-apiserver-certs\") pod \"calico-apiserver-c7c6f64f5-jnchp\" (UID: \"7c206eb9-9889-4da7-833e-4f8997709e6e\") " pod="calico-apiserver/calico-apiserver-c7c6f64f5-jnchp" Jan 29 16:13:12.190393 kubelet[2663]: I0129 16:13:12.187811 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8ae120f8-d2d4-415d-b245-151e25c130c4-calico-apiserver-certs\") pod \"calico-apiserver-c7c6f64f5-hshkw\" (UID: \"8ae120f8-d2d4-415d-b245-151e25c130c4\") " pod="calico-apiserver/calico-apiserver-c7c6f64f5-hshkw" Jan 29 16:13:12.190393 kubelet[2663]: I0129 16:13:12.187846 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba3c4db0-9089-4994-8c25-05651ef8025d-config-volume\") pod \"coredns-6f6b679f8f-hvt88\" (UID: \"ba3c4db0-9089-4994-8c25-05651ef8025d\") " pod="kube-system/coredns-6f6b679f8f-hvt88" Jan 29 16:13:12.206231 systemd[1]: Created slice kubepods-burstable-pod424af8c2_52c0_4e44_8235_7c2d50b2f3f1.slice - libcontainer container kubepods-burstable-pod424af8c2_52c0_4e44_8235_7c2d50b2f3f1.slice. Jan 29 16:13:12.220650 systemd[1]: Created slice kubepods-besteffort-podd480db07_6802_4f6a_b925_8f4c938e9f7b.slice - libcontainer container kubepods-besteffort-podd480db07_6802_4f6a_b925_8f4c938e9f7b.slice. Jan 29 16:13:12.239960 systemd[1]: Created slice kubepods-besteffort-pod7c206eb9_9889_4da7_833e_4f8997709e6e.slice - libcontainer container kubepods-besteffort-pod7c206eb9_9889_4da7_833e_4f8997709e6e.slice. Jan 29 16:13:12.247107 systemd[1]: Created slice kubepods-besteffort-pod8ae120f8_d2d4_415d_b245_151e25c130c4.slice - libcontainer container kubepods-besteffort-pod8ae120f8_d2d4_415d_b245_151e25c130c4.slice. Jan 29 16:13:12.268503 systemd[1]: Created slice kubepods-burstable-podba3c4db0_9089_4994_8c25_05651ef8025d.slice - libcontainer container kubepods-burstable-podba3c4db0_9089_4994_8c25_05651ef8025d.slice. 
Jan 29 16:13:12.373247 containerd[1493]: time="2025-01-29T16:13:12.373038105Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 29 16:13:12.519632 containerd[1493]: time="2025-01-29T16:13:12.519575326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9s2kl,Uid:424af8c2-52c0-4e44-8235-7c2d50b2f3f1,Namespace:kube-system,Attempt:0,}" Jan 29 16:13:12.527196 containerd[1493]: time="2025-01-29T16:13:12.527099234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85cfdd4458-rcthl,Uid:d480db07-6802-4f6a-b925-8f4c938e9f7b,Namespace:calico-system,Attempt:0,}" Jan 29 16:13:12.555091 containerd[1493]: time="2025-01-29T16:13:12.554759663Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7c6f64f5-jnchp,Uid:7c206eb9-9889-4da7-833e-4f8997709e6e,Namespace:calico-apiserver,Attempt:0,}" Jan 29 16:13:12.567662 containerd[1493]: time="2025-01-29T16:13:12.567610087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7c6f64f5-hshkw,Uid:8ae120f8-d2d4-415d-b245-151e25c130c4,Namespace:calico-apiserver,Attempt:0,}" Jan 29 16:13:12.576411 containerd[1493]: time="2025-01-29T16:13:12.576363714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-hvt88,Uid:ba3c4db0-9089-4994-8c25-05651ef8025d,Namespace:kube-system,Attempt:0,}" Jan 29 16:13:12.937744 containerd[1493]: time="2025-01-29T16:13:12.937479401Z" level=error msg="Failed to destroy network for sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.953768 containerd[1493]: time="2025-01-29T16:13:12.953223057Z" level=error msg="encountered an error cleaning up failed sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.953768 containerd[1493]: time="2025-01-29T16:13:12.953320040Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-hvt88,Uid:ba3c4db0-9089-4994-8c25-05651ef8025d,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.961290 containerd[1493]: time="2025-01-29T16:13:12.961250342Z" level=error msg="Failed to destroy network for sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.962321 containerd[1493]: time="2025-01-29T16:13:12.961898618Z" level=error msg="encountered an error cleaning up failed sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Jan 29 16:13:12.962321 containerd[1493]: time="2025-01-29T16:13:12.961966600Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9s2kl,Uid:424af8c2-52c0-4e44-8235-7c2d50b2f3f1,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.976544 containerd[1493]: time="2025-01-29T16:13:12.974702113Z" level=error msg="Failed to destroy network for sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.978752 kubelet[2663]: E0129 16:13:12.976890 2663 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.978752 kubelet[2663]: E0129 16:13:12.977096 2663 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9s2kl" Jan 29 16:13:12.978752 kubelet[2663]: E0129 16:13:12.977142 2663 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9s2kl" Jan 29 16:13:12.980117 kubelet[2663]: E0129 16:13:12.977247 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-9s2kl_kube-system(424af8c2-52c0-4e44-8235-7c2d50b2f3f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-9s2kl_kube-system(424af8c2-52c0-4e44-8235-7c2d50b2f3f1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-9s2kl" podUID="424af8c2-52c0-4e44-8235-7c2d50b2f3f1" Jan 29 16:13:12.980117 kubelet[2663]: E0129 16:13:12.977598 2663 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.980117 kubelet[2663]: E0129 16:13:12.977633 2663 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-hvt88" Jan 29 16:13:12.981279 kubelet[2663]: E0129 16:13:12.977658 2663 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-hvt88" Jan 29 16:13:12.981279 kubelet[2663]: E0129 16:13:12.977699 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-hvt88_kube-system(ba3c4db0-9089-4994-8c25-05651ef8025d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-hvt88_kube-system(ba3c4db0-9089-4994-8c25-05651ef8025d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-hvt88" podUID="ba3c4db0-9089-4994-8c25-05651ef8025d" Jan 29 16:13:12.982228 containerd[1493]: time="2025-01-29T16:13:12.974939803Z" level=error msg="Failed to destroy network for sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.983060 containerd[1493]: time="2025-01-29T16:13:12.982692320Z" level=error msg="encountered an error cleaning up failed sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.983060 containerd[1493]: time="2025-01-29T16:13:12.982759177Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7c6f64f5-jnchp,Uid:7c206eb9-9889-4da7-833e-4f8997709e6e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.983060 containerd[1493]: time="2025-01-29T16:13:12.982845352Z" level=error msg="encountered an error cleaning up failed sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" 
failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.983833 kubelet[2663]: E0129 16:13:12.982982 2663 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.983833 kubelet[2663]: E0129 16:13:12.983043 2663 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c7c6f64f5-jnchp" Jan 29 16:13:12.983833 kubelet[2663]: E0129 16:13:12.983066 2663 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c7c6f64f5-jnchp" Jan 29 16:13:12.989858 kubelet[2663]: E0129 16:13:12.983113 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c7c6f64f5-jnchp_calico-apiserver(7c206eb9-9889-4da7-833e-4f8997709e6e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c7c6f64f5-jnchp_calico-apiserver(7c206eb9-9889-4da7-833e-4f8997709e6e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-c7c6f64f5-jnchp" podUID="7c206eb9-9889-4da7-833e-4f8997709e6e" Jan 29 16:13:12.989858 kubelet[2663]: E0129 16:13:12.984394 2663 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.989858 kubelet[2663]: E0129 16:13:12.984555 2663 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-85cfdd4458-rcthl" Jan 29 16:13:12.990119 containerd[1493]: time="2025-01-29T16:13:12.983345416Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-85cfdd4458-rcthl,Uid:d480db07-6802-4f6a-b925-8f4c938e9f7b,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:12.990239 kubelet[2663]: E0129 16:13:12.984646 2663 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-85cfdd4458-rcthl" Jan 29 16:13:12.990239 kubelet[2663]: E0129 16:13:12.984740 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-85cfdd4458-rcthl_calico-system(d480db07-6802-4f6a-b925-8f4c938e9f7b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-85cfdd4458-rcthl_calico-system(d480db07-6802-4f6a-b925-8f4c938e9f7b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-85cfdd4458-rcthl" podUID="d480db07-6802-4f6a-b925-8f4c938e9f7b" Jan 29 16:13:13.019609 containerd[1493]: time="2025-01-29T16:13:13.019556279Z" level=error msg="Failed to destroy network for sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.020048 containerd[1493]: time="2025-01-29T16:13:13.020010332Z" level=error msg="encountered an error cleaning up failed sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.021245 containerd[1493]: time="2025-01-29T16:13:13.020073862Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7c6f64f5-hshkw,Uid:8ae120f8-d2d4-415d-b245-151e25c130c4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.022793 kubelet[2663]: E0129 16:13:13.022386 2663 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.022793 kubelet[2663]: E0129 16:13:13.022453 2663 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c7c6f64f5-hshkw" Jan 29 16:13:13.022793 kubelet[2663]: E0129 16:13:13.022486 2663 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-c7c6f64f5-hshkw" Jan 29 16:13:13.024412 kubelet[2663]: E0129 16:13:13.022543 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-c7c6f64f5-hshkw_calico-apiserver(8ae120f8-d2d4-415d-b245-151e25c130c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-c7c6f64f5-hshkw_calico-apiserver(8ae120f8-d2d4-415d-b245-151e25c130c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-c7c6f64f5-hshkw" podUID="8ae120f8-d2d4-415d-b245-151e25c130c4" Jan 29 16:13:13.023076 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e-shm.mount: Deactivated successfully. Jan 29 16:13:13.177475 systemd[1]: Created slice kubepods-besteffort-pode9d2429c_47e2_48a0_bc35_409f18438229.slice - libcontainer container kubepods-besteffort-pode9d2429c_47e2_48a0_bc35_409f18438229.slice. 
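Every sandbox failure above and below bottoms out in the same stat: the Calico CNI plugin resolves the node's identity from /var/lib/calico/nodename, a file the calico/node container writes once it is running (which is what the logged hint "check that the calico/node container is running and has mounted /var/lib/calico/" points at), so every CNI add/delete fails until that container comes up. A sketch of that lookup under the stated assumption about the file's role (not Calico's actual source):

```go
// nodename_check_sketch.go -- illustrative sketch of the lookup behind the
// repeated sandbox failures: read the node identity that calico/node is
// expected to have written, and fail with the same diagnostic otherwise.
package main

import (
	"fmt"
	"os"
	"strings"
)

const nodenameFile = "/var/lib/calico/nodename"

func nodename() (string, error) {
	data, err := os.ReadFile(nodenameFile)
	if err != nil {
		// Same shape as the logged diagnostic.
		return "", fmt.Errorf("stat %s: %w: check that the calico/node container is running and has mounted /var/lib/calico/", nodenameFile, err)
	}
	return strings.TrimSpace(string(data)), nil
}

func main() {
	name, err := nodename()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("node:", name)
}
```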
Jan 29 16:13:13.183002 containerd[1493]: time="2025-01-29T16:13:13.182955141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gd895,Uid:e9d2429c-47e2-48a0-bc35-409f18438229,Namespace:calico-system,Attempt:0,}" Jan 29 16:13:13.272117 containerd[1493]: time="2025-01-29T16:13:13.272037510Z" level=error msg="Failed to destroy network for sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.275792 containerd[1493]: time="2025-01-29T16:13:13.273304516Z" level=error msg="encountered an error cleaning up failed sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.275792 containerd[1493]: time="2025-01-29T16:13:13.273407601Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gd895,Uid:e9d2429c-47e2-48a0-bc35-409f18438229,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.276006 kubelet[2663]: E0129 16:13:13.273687 2663 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.276006 kubelet[2663]: E0129 16:13:13.273764 2663 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gd895" Jan 29 16:13:13.276006 kubelet[2663]: E0129 16:13:13.273793 2663 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gd895" Jan 29 16:13:13.276592 kubelet[2663]: E0129 16:13:13.273845 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gd895_calico-system(e9d2429c-47e2-48a0-bc35-409f18438229)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gd895_calico-system(e9d2429c-47e2-48a0-bc35-409f18438229)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gd895" podUID="e9d2429c-47e2-48a0-bc35-409f18438229" Jan 29 16:13:13.277519 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787-shm.mount: Deactivated successfully. Jan 29 16:13:13.374973 kubelet[2663]: I0129 16:13:13.374926 2663 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:13.380285 kubelet[2663]: I0129 16:13:13.379708 2663 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:13.381493 containerd[1493]: time="2025-01-29T16:13:13.381440352Z" level=info msg="StopPodSandbox for \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\"" Jan 29 16:13:13.385540 containerd[1493]: time="2025-01-29T16:13:13.384464080Z" level=info msg="Ensure that sandbox 31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e in task-service has been cleanup successfully" Jan 29 16:13:13.388044 containerd[1493]: time="2025-01-29T16:13:13.388005164Z" level=info msg="StopPodSandbox for \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\"" Jan 29 16:13:13.389495 kubelet[2663]: I0129 16:13:13.388648 2663 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:13.389639 containerd[1493]: time="2025-01-29T16:13:13.388865292Z" level=info msg="Ensure that sandbox d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2 in task-service has been cleanup successfully" Jan 29 16:13:13.389950 containerd[1493]: time="2025-01-29T16:13:13.389912940Z" level=info msg="StopPodSandbox for \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\"" Jan 29 16:13:13.390341 containerd[1493]: time="2025-01-29T16:13:13.390311588Z" level=info msg="Ensure that sandbox b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b in task-service has been cleanup successfully" Jan 29 16:13:13.393890 kubelet[2663]: I0129 16:13:13.393867 2663 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:13.396492 containerd[1493]: time="2025-01-29T16:13:13.396434456Z" level=info msg="StopPodSandbox for \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\"" Jan 29 16:13:13.396794 containerd[1493]: time="2025-01-29T16:13:13.396641430Z" level=info msg="Ensure that sandbox 1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6 in task-service has been cleanup successfully" Jan 29 16:13:13.398895 kubelet[2663]: I0129 16:13:13.398870 2663 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:13.400277 containerd[1493]: time="2025-01-29T16:13:13.399914514Z" level=info msg="StopPodSandbox for \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\"" Jan 29 16:13:13.401220 containerd[1493]: time="2025-01-29T16:13:13.401091959Z" level=info msg="Ensure that sandbox 
9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede in task-service has been cleanup successfully" Jan 29 16:13:13.409432 kubelet[2663]: I0129 16:13:13.409276 2663 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:13.411603 containerd[1493]: time="2025-01-29T16:13:13.410821150Z" level=info msg="StopPodSandbox for \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\"" Jan 29 16:13:13.411603 containerd[1493]: time="2025-01-29T16:13:13.411068804Z" level=info msg="Ensure that sandbox 3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787 in task-service has been cleanup successfully" Jan 29 16:13:13.490220 containerd[1493]: time="2025-01-29T16:13:13.489510767Z" level=error msg="StopPodSandbox for \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\" failed" error="failed to destroy network for sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.490511 kubelet[2663]: E0129 16:13:13.489828 2663 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:13.490511 kubelet[2663]: E0129 16:13:13.489911 2663 kuberuntime_manager.go:1477] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2"} Jan 29 16:13:13.490511 kubelet[2663]: E0129 16:13:13.490044 2663 kuberuntime_manager.go:1077] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"ba3c4db0-9089-4994-8c25-05651ef8025d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 29 16:13:13.490511 kubelet[2663]: E0129 16:13:13.490083 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"ba3c4db0-9089-4994-8c25-05651ef8025d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-hvt88" podUID="ba3c4db0-9089-4994-8c25-05651ef8025d" Jan 29 16:13:13.498039 containerd[1493]: time="2025-01-29T16:13:13.497988504Z" level=error msg="StopPodSandbox for \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\" failed" error="failed to destroy network for sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\": plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.498451 kubelet[2663]: E0129 16:13:13.498413 2663 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:13.498451 kubelet[2663]: E0129 16:13:13.498463 2663 kuberuntime_manager.go:1477] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6"} Jan 29 16:13:13.498913 kubelet[2663]: E0129 16:13:13.498505 2663 kuberuntime_manager.go:1077] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"d480db07-6802-4f6a-b925-8f4c938e9f7b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 29 16:13:13.498913 kubelet[2663]: E0129 16:13:13.498532 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"d480db07-6802-4f6a-b925-8f4c938e9f7b\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-85cfdd4458-rcthl" podUID="d480db07-6802-4f6a-b925-8f4c938e9f7b" Jan 29 16:13:13.526248 containerd[1493]: time="2025-01-29T16:13:13.525670240Z" level=error msg="StopPodSandbox for \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\" failed" error="failed to destroy network for sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.526414 kubelet[2663]: E0129 16:13:13.525968 2663 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:13.526414 kubelet[2663]: E0129 16:13:13.526026 2663 kuberuntime_manager.go:1477] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e"} Jan 29 16:13:13.526414 kubelet[2663]: E0129 16:13:13.526068 2663 kuberuntime_manager.go:1077] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for 
\"8ae120f8-d2d4-415d-b245-151e25c130c4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 29 16:13:13.526414 kubelet[2663]: E0129 16:13:13.526098 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"8ae120f8-d2d4-415d-b245-151e25c130c4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-c7c6f64f5-hshkw" podUID="8ae120f8-d2d4-415d-b245-151e25c130c4" Jan 29 16:13:13.534023 containerd[1493]: time="2025-01-29T16:13:13.533979474Z" level=error msg="StopPodSandbox for \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\" failed" error="failed to destroy network for sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.534664 kubelet[2663]: E0129 16:13:13.534177 2663 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:13.534664 kubelet[2663]: E0129 16:13:13.534254 2663 kuberuntime_manager.go:1477] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede"} Jan 29 16:13:13.534664 kubelet[2663]: E0129 16:13:13.534313 2663 kuberuntime_manager.go:1077] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"424af8c2-52c0-4e44-8235-7c2d50b2f3f1\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 29 16:13:13.534664 kubelet[2663]: E0129 16:13:13.534386 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"424af8c2-52c0-4e44-8235-7c2d50b2f3f1\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-9s2kl" podUID="424af8c2-52c0-4e44-8235-7c2d50b2f3f1" Jan 29 16:13:13.538486 containerd[1493]: 
time="2025-01-29T16:13:13.538393677Z" level=error msg="StopPodSandbox for \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\" failed" error="failed to destroy network for sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.539789 kubelet[2663]: E0129 16:13:13.538585 2663 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:13.539789 kubelet[2663]: E0129 16:13:13.538636 2663 kuberuntime_manager.go:1477] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b"} Jan 29 16:13:13.539789 kubelet[2663]: E0129 16:13:13.538671 2663 kuberuntime_manager.go:1077] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7c206eb9-9889-4da7-833e-4f8997709e6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 29 16:13:13.539789 kubelet[2663]: E0129 16:13:13.538696 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7c206eb9-9889-4da7-833e-4f8997709e6e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-c7c6f64f5-jnchp" podUID="7c206eb9-9889-4da7-833e-4f8997709e6e" Jan 29 16:13:13.545526 containerd[1493]: time="2025-01-29T16:13:13.545343314Z" level=error msg="StopPodSandbox for \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\" failed" error="failed to destroy network for sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 29 16:13:13.545613 kubelet[2663]: E0129 16:13:13.545543 2663 log.go:32] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:13.545613 kubelet[2663]: E0129 16:13:13.545586 2663 kuberuntime_manager.go:1477] "Failed to stop sandbox" 
podSandboxID={"Type":"containerd","ID":"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787"} Jan 29 16:13:13.545757 kubelet[2663]: E0129 16:13:13.545644 2663 kuberuntime_manager.go:1077] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"e9d2429c-47e2-48a0-bc35-409f18438229\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 29 16:13:13.545757 kubelet[2663]: E0129 16:13:13.545673 2663 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"e9d2429c-47e2-48a0-bc35-409f18438229\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gd895" podUID="e9d2429c-47e2-48a0-bc35-409f18438229" Jan 29 16:13:16.727644 kubelet[2663]: I0129 16:13:16.726335 2663 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 16:13:22.728415 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1236013552.mount: Deactivated successfully. Jan 29 16:13:22.828975 containerd[1493]: time="2025-01-29T16:13:22.828830663Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Jan 29 16:13:22.833683 containerd[1493]: time="2025-01-29T16:13:22.821513667Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:22.843612 containerd[1493]: time="2025-01-29T16:13:22.843556023Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:22.845103 containerd[1493]: time="2025-01-29T16:13:22.844722243Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:22.846992 containerd[1493]: time="2025-01-29T16:13:22.846951702Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 10.47384354s" Jan 29 16:13:22.847205 containerd[1493]: time="2025-01-29T16:13:22.847146747Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Jan 29 16:13:22.917631 containerd[1493]: time="2025-01-29T16:13:22.917585737Z" level=info msg="CreateContainer within sandbox \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 29 16:13:22.968778 containerd[1493]: time="2025-01-29T16:13:22.968666321Z" 
level=info msg="CreateContainer within sandbox \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\"" Jan 29 16:13:22.973020 containerd[1493]: time="2025-01-29T16:13:22.972982358Z" level=info msg="StartContainer for \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\"" Jan 29 16:13:23.208429 systemd[1]: Started cri-containerd-3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395.scope - libcontainer container 3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395. Jan 29 16:13:23.308309 containerd[1493]: time="2025-01-29T16:13:23.306793418Z" level=info msg="StartContainer for \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\" returns successfully" Jan 29 16:13:23.556380 kubelet[2663]: I0129 16:13:23.543278 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-p7vmg" podStartSLOduration=2.5022753250000003 podStartE2EDuration="26.526026512s" podCreationTimestamp="2025-01-29 16:12:57 +0000 UTC" firstStartedPulling="2025-01-29 16:12:58.824617305 +0000 UTC m=+14.843935100" lastFinishedPulling="2025-01-29 16:13:22.848368495 +0000 UTC m=+38.867686287" observedRunningTime="2025-01-29 16:13:23.5150767 +0000 UTC m=+39.534394491" watchObservedRunningTime="2025-01-29 16:13:23.526026512 +0000 UTC m=+39.545344316" Jan 29 16:13:23.639859 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 29 16:13:23.641360 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 29 16:13:24.174859 containerd[1493]: time="2025-01-29T16:13:24.174773695Z" level=info msg="StopPodSandbox for \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\"" Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.308 [INFO][3851] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.309 [INFO][3851] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" iface="eth0" netns="/var/run/netns/cni-92341c98-d35b-e571-cf12-3020ab7bb0e0" Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.309 [INFO][3851] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" iface="eth0" netns="/var/run/netns/cni-92341c98-d35b-e571-cf12-3020ab7bb0e0" Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.311 [INFO][3851] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" iface="eth0" netns="/var/run/netns/cni-92341c98-d35b-e571-cf12-3020ab7bb0e0" Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.311 [INFO][3851] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.311 [INFO][3851] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.530 [INFO][3857] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" HandleID="k8s-pod-network.31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.535 [INFO][3857] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.536 [INFO][3857] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.556 [WARNING][3857] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" HandleID="k8s-pod-network.31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.556 [INFO][3857] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" HandleID="k8s-pod-network.31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.558 [INFO][3857] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:24.564528 containerd[1493]: 2025-01-29 16:13:24.561 [INFO][3851] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:24.569884 systemd[1]: run-netns-cni\x2d92341c98\x2dd35b\x2de571\x2dcf12\x2d3020ab7bb0e0.mount: Deactivated successfully. 
Jan 29 16:13:24.576887 containerd[1493]: time="2025-01-29T16:13:24.576382435Z" level=info msg="TearDown network for sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\" successfully" Jan 29 16:13:24.576887 containerd[1493]: time="2025-01-29T16:13:24.576455822Z" level=info msg="StopPodSandbox for \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\" returns successfully" Jan 29 16:13:24.577910 containerd[1493]: time="2025-01-29T16:13:24.577859416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7c6f64f5-hshkw,Uid:8ae120f8-d2d4-415d-b245-151e25c130c4,Namespace:calico-apiserver,Attempt:1,}" Jan 29 16:13:24.808303 systemd-networkd[1427]: cali68ee9c7f433: Link UP Jan 29 16:13:24.808676 systemd-networkd[1427]: cali68ee9c7f433: Gained carrier Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.664 [INFO][3884] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.684 [INFO][3884] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0 calico-apiserver-c7c6f64f5- calico-apiserver 8ae120f8-d2d4-415d-b245-151e25c130c4 810 0 2025-01-29 16:12:58 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c7c6f64f5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-6bdnt.gb1.brightbox.com calico-apiserver-c7c6f64f5-hshkw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali68ee9c7f433 [] []}} ContainerID="f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-hshkw" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.684 [INFO][3884] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-hshkw" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.732 [INFO][3894] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" HandleID="k8s-pod-network.f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.744 [INFO][3894] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" HandleID="k8s-pod-network.f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003196e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-6bdnt.gb1.brightbox.com", "pod":"calico-apiserver-c7c6f64f5-hshkw", "timestamp":"2025-01-29 16:13:24.732899291 +0000 UTC"}, Hostname:"srv-6bdnt.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.745 [INFO][3894] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.745 [INFO][3894] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.745 [INFO][3894] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6bdnt.gb1.brightbox.com' Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.748 [INFO][3894] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.757 [INFO][3894] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.763 [INFO][3894] ipam/ipam.go 489: Trying affinity for 192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.766 [INFO][3894] ipam/ipam.go 155: Attempting to load block cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.769 [INFO][3894] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.769 [INFO][3894] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.17.128/26 handle="k8s-pod-network.f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.771 [INFO][3894] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9 Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.777 [INFO][3894] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.17.128/26 handle="k8s-pod-network.f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.784 [INFO][3894] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.17.129/26] block=192.168.17.128/26 handle="k8s-pod-network.f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.785 [INFO][3894] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.17.129/26] handle="k8s-pod-network.f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.785 [INFO][3894] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
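
The IPAM exchange above is the normal fast path: the host already holds an affinity for the block 192.168.17.128/26, loads it, and claims 192.168.17.129 for calico-apiserver-c7c6f64f5-hshkw (the next pod gets .130 further down). A /26 leaves 6 host bits, so the block spans the 64 addresses .128 through .191; a quick check of that arithmetic:

    package main

    import (
        "fmt"
        "net/netip"
    )

    func main() {
        // The affine block and the assigned address from the records above.
        block := netip.MustParsePrefix("192.168.17.128/26")
        assigned := netip.MustParseAddr("192.168.17.129")

        hostBits := 32 - block.Bits() // 6 host bits
        fmt.Printf("%s holds %d addresses\n", block, 1<<hostBits) // 64: .128 .. .191
        fmt.Println(block.Contains(assigned))                     // true
    }
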
Jan 29 16:13:24.830830 containerd[1493]: 2025-01-29 16:13:24.785 [INFO][3894] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.17.129/26] IPv6=[] ContainerID="f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" HandleID="k8s-pod-network.f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:24.832011 containerd[1493]: 2025-01-29 16:13:24.789 [INFO][3884] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-hshkw" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0", GenerateName:"calico-apiserver-c7c6f64f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"8ae120f8-d2d4-415d-b245-151e25c130c4", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7c6f64f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-c7c6f64f5-hshkw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali68ee9c7f433", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:24.832011 containerd[1493]: 2025-01-29 16:13:24.789 [INFO][3884] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.17.129/32] ContainerID="f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-hshkw" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:24.832011 containerd[1493]: 2025-01-29 16:13:24.789 [INFO][3884] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali68ee9c7f433 ContainerID="f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-hshkw" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:24.832011 containerd[1493]: 2025-01-29 16:13:24.805 [INFO][3884] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-hshkw" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:24.832011 containerd[1493]: 2025-01-29 16:13:24.806 [INFO][3884] cni-plugin/k8s.go 414: Added Mac, 
interface name, and active container ID to endpoint ContainerID="f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-hshkw" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0", GenerateName:"calico-apiserver-c7c6f64f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"8ae120f8-d2d4-415d-b245-151e25c130c4", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7c6f64f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9", Pod:"calico-apiserver-c7c6f64f5-hshkw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali68ee9c7f433", MAC:"b6:37:eb:54:1e:8b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:24.832011 containerd[1493]: 2025-01-29 16:13:24.821 [INFO][3884] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-hshkw" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:24.875315 containerd[1493]: time="2025-01-29T16:13:24.874858899Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:13:24.875315 containerd[1493]: time="2025-01-29T16:13:24.874963612Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:13:24.875315 containerd[1493]: time="2025-01-29T16:13:24.875001193Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:24.875315 containerd[1493]: time="2025-01-29T16:13:24.875203358Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:24.915777 systemd[1]: Started cri-containerd-f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9.scope - libcontainer container f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9. 
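
The &v3.WorkloadEndpoint dumps above are whole Go structs printed into the log, which makes them hard to scan. Stripped to the fields that actually change during setup, and filled with the values from the final record (after the MAC b6:37:eb:54:1e:8b was added), the endpoint reduces to something like the abbreviated stand-in below; the real projectcalico.org/v3 type has many more fields:

    package main

    import "fmt"

    // Abbreviated stand-in for the projectcalico.org/v3 WorkloadEndpointSpec
    // fields visible in the dump above.
    type WorkloadEndpointSpec struct {
        Orchestrator  string
        Node          string
        ContainerID   string
        Pod           string
        Endpoint      string
        IPNetworks    []string
        Profiles      []string
        InterfaceName string
        MAC           string
    }

    func main() {
        // Values copied from the "Added Mac, interface name, and active
        // container ID to endpoint" record above.
        ep := WorkloadEndpointSpec{
            Orchestrator:  "k8s",
            Node:          "srv-6bdnt.gb1.brightbox.com",
            ContainerID:   "f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9",
            Pod:           "calico-apiserver-c7c6f64f5-hshkw",
            Endpoint:      "eth0",
            IPNetworks:    []string{"192.168.17.129/32"},
            Profiles:      []string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"},
            InterfaceName: "cali68ee9c7f433",
            MAC:           "b6:37:eb:54:1e:8b",
        }
        fmt.Printf("%+v\n", ep)
    }
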
Jan 29 16:13:24.986076 containerd[1493]: time="2025-01-29T16:13:24.985672766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7c6f64f5-hshkw,Uid:8ae120f8-d2d4-415d-b245-151e25c130c4,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9\"" Jan 29 16:13:24.988544 containerd[1493]: time="2025-01-29T16:13:24.988307866Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 29 16:13:25.171493 containerd[1493]: time="2025-01-29T16:13:25.170739730Z" level=info msg="StopPodSandbox for \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\"" Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.273 [INFO][4009] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.273 [INFO][4009] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" iface="eth0" netns="/var/run/netns/cni-fd649d6e-4491-bdcd-f847-b362e1043e51" Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.274 [INFO][4009] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" iface="eth0" netns="/var/run/netns/cni-fd649d6e-4491-bdcd-f847-b362e1043e51" Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.274 [INFO][4009] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" iface="eth0" netns="/var/run/netns/cni-fd649d6e-4491-bdcd-f847-b362e1043e51" Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.274 [INFO][4009] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.274 [INFO][4009] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.322 [INFO][4044] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.323 [INFO][4044] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.323 [INFO][4044] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.335 [WARNING][4044] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.335 [INFO][4044] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.337 [INFO][4044] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:25.342110 containerd[1493]: 2025-01-29 16:13:25.339 [INFO][4009] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:25.348192 containerd[1493]: time="2025-01-29T16:13:25.345825285Z" level=info msg="TearDown network for sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\" successfully" Jan 29 16:13:25.348192 containerd[1493]: time="2025-01-29T16:13:25.345865473Z" level=info msg="StopPodSandbox for \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\" returns successfully" Jan 29 16:13:25.347753 systemd[1]: run-netns-cni\x2dfd649d6e\x2d4491\x2dbdcd\x2df847\x2db362e1043e51.mount: Deactivated successfully. Jan 29 16:13:25.351588 containerd[1493]: time="2025-01-29T16:13:25.351182583Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85cfdd4458-rcthl,Uid:d480db07-6802-4f6a-b925-8f4c938e9f7b,Namespace:calico-system,Attempt:1,}" Jan 29 16:13:25.707162 systemd-networkd[1427]: calie19d29f925e: Link UP Jan 29 16:13:25.712302 systemd-networkd[1427]: calie19d29f925e: Gained carrier Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.461 [INFO][4063] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.490 [INFO][4063] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0 calico-kube-controllers-85cfdd4458- calico-system d480db07-6802-4f6a-b925-8f4c938e9f7b 821 0 2025-01-29 16:12:57 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:85cfdd4458 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-6bdnt.gb1.brightbox.com calico-kube-controllers-85cfdd4458-rcthl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calie19d29f925e [] []}} ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Namespace="calico-system" Pod="calico-kube-controllers-85cfdd4458-rcthl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.491 [INFO][4063] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Namespace="calico-system" Pod="calico-kube-controllers-85cfdd4458-rcthl" 
WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.634 [INFO][4077] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.654 [INFO][4077] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00026b560), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-6bdnt.gb1.brightbox.com", "pod":"calico-kube-controllers-85cfdd4458-rcthl", "timestamp":"2025-01-29 16:13:25.634607982 +0000 UTC"}, Hostname:"srv-6bdnt.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.654 [INFO][4077] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.654 [INFO][4077] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.655 [INFO][4077] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6bdnt.gb1.brightbox.com' Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.658 [INFO][4077] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.664 [INFO][4077] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.672 [INFO][4077] ipam/ipam.go 489: Trying affinity for 192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.676 [INFO][4077] ipam/ipam.go 155: Attempting to load block cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.679 [INFO][4077] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.679 [INFO][4077] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.17.128/26 handle="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.683 [INFO][4077] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.690 [INFO][4077] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.17.128/26 
handle="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.697 [INFO][4077] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.17.130/26] block=192.168.17.128/26 handle="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.697 [INFO][4077] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.17.130/26] handle="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.697 [INFO][4077] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:25.748140 containerd[1493]: 2025-01-29 16:13:25.697 [INFO][4077] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.17.130/26] IPv6=[] ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:25.750981 containerd[1493]: 2025-01-29 16:13:25.701 [INFO][4063] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Namespace="calico-system" Pod="calico-kube-controllers-85cfdd4458-rcthl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0", GenerateName:"calico-kube-controllers-85cfdd4458-", Namespace:"calico-system", SelfLink:"", UID:"d480db07-6802-4f6a-b925-8f4c938e9f7b", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"85cfdd4458", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-85cfdd4458-rcthl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.17.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie19d29f925e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:25.750981 containerd[1493]: 2025-01-29 16:13:25.701 [INFO][4063] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.17.130/32] ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Namespace="calico-system" Pod="calico-kube-controllers-85cfdd4458-rcthl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 
29 16:13:25.750981 containerd[1493]: 2025-01-29 16:13:25.701 [INFO][4063] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie19d29f925e ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Namespace="calico-system" Pod="calico-kube-controllers-85cfdd4458-rcthl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:25.750981 containerd[1493]: 2025-01-29 16:13:25.711 [INFO][4063] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Namespace="calico-system" Pod="calico-kube-controllers-85cfdd4458-rcthl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:25.750981 containerd[1493]: 2025-01-29 16:13:25.712 [INFO][4063] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Namespace="calico-system" Pod="calico-kube-controllers-85cfdd4458-rcthl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0", GenerateName:"calico-kube-controllers-85cfdd4458-", Namespace:"calico-system", SelfLink:"", UID:"d480db07-6802-4f6a-b925-8f4c938e9f7b", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"85cfdd4458", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca", Pod:"calico-kube-controllers-85cfdd4458-rcthl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.17.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie19d29f925e", MAC:"1e:17:88:54:17:34", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:25.750981 containerd[1493]: 2025-01-29 16:13:25.730 [INFO][4063] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Namespace="calico-system" Pod="calico-kube-controllers-85cfdd4458-rcthl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:25.829671 containerd[1493]: time="2025-01-29T16:13:25.829447956Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:13:25.829671 containerd[1493]: time="2025-01-29T16:13:25.829602346Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:13:25.829671 containerd[1493]: time="2025-01-29T16:13:25.829627988Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:25.830496 containerd[1493]: time="2025-01-29T16:13:25.829887281Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:25.873368 systemd[1]: Started cri-containerd-d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca.scope - libcontainer container d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca. Jan 29 16:13:26.029285 containerd[1493]: time="2025-01-29T16:13:26.027717239Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-85cfdd4458-rcthl,Uid:d480db07-6802-4f6a-b925-8f4c938e9f7b,Namespace:calico-system,Attempt:1,} returns sandbox id \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\"" Jan 29 16:13:26.154375 kernel: bpftool[4192]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 29 16:13:26.508631 systemd-networkd[1427]: vxlan.calico: Link UP Jan 29 16:13:26.508642 systemd-networkd[1427]: vxlan.calico: Gained carrier Jan 29 16:13:26.540924 systemd-networkd[1427]: cali68ee9c7f433: Gained IPv6LL Jan 29 16:13:27.171153 containerd[1493]: time="2025-01-29T16:13:27.171087649Z" level=info msg="StopPodSandbox for \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\"" Jan 29 16:13:27.171795 containerd[1493]: time="2025-01-29T16:13:27.171451163Z" level=info msg="StopPodSandbox for \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\"" Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.357 [INFO][4304] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.358 [INFO][4304] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" iface="eth0" netns="/var/run/netns/cni-ec90abec-f1ee-c2a5-f98e-33b145f4585b" Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.359 [INFO][4304] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" iface="eth0" netns="/var/run/netns/cni-ec90abec-f1ee-c2a5-f98e-33b145f4585b" Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.360 [INFO][4304] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" iface="eth0" netns="/var/run/netns/cni-ec90abec-f1ee-c2a5-f98e-33b145f4585b" Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.360 [INFO][4304] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.360 [INFO][4304] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.447 [INFO][4324] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" HandleID="k8s-pod-network.b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.448 [INFO][4324] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.448 [INFO][4324] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.464 [WARNING][4324] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" HandleID="k8s-pod-network.b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.464 [INFO][4324] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" HandleID="k8s-pod-network.b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.467 [INFO][4324] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:27.474773 containerd[1493]: 2025-01-29 16:13:27.470 [INFO][4304] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:27.478833 containerd[1493]: time="2025-01-29T16:13:27.475378134Z" level=info msg="TearDown network for sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\" successfully" Jan 29 16:13:27.478833 containerd[1493]: time="2025-01-29T16:13:27.475454741Z" level=info msg="StopPodSandbox for \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\" returns successfully" Jan 29 16:13:27.477306 systemd[1]: run-netns-cni\x2dec90abec\x2df1ee\x2dc2a5\x2df98e\x2d33b145f4585b.mount: Deactivated successfully. Jan 29 16:13:27.482277 containerd[1493]: time="2025-01-29T16:13:27.480868915Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7c6f64f5-jnchp,Uid:7c206eb9-9889-4da7-833e-4f8997709e6e,Namespace:calico-apiserver,Attempt:1,}" Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.347 [INFO][4308] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.349 [INFO][4308] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" iface="eth0" netns="/var/run/netns/cni-dd7886cc-1855-509f-2bf1-12a3f0cd298d" Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.353 [INFO][4308] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" iface="eth0" netns="/var/run/netns/cni-dd7886cc-1855-509f-2bf1-12a3f0cd298d" Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.357 [INFO][4308] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" iface="eth0" netns="/var/run/netns/cni-dd7886cc-1855-509f-2bf1-12a3f0cd298d" Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.357 [INFO][4308] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.357 [INFO][4308] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.467 [INFO][4323] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" HandleID="k8s-pod-network.9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.469 [INFO][4323] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.470 [INFO][4323] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.495 [WARNING][4323] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" HandleID="k8s-pod-network.9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.495 [INFO][4323] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" HandleID="k8s-pod-network.9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.502 [INFO][4323] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:27.521072 containerd[1493]: 2025-01-29 16:13:27.514 [INFO][4308] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:27.524604 containerd[1493]: time="2025-01-29T16:13:27.524496103Z" level=info msg="TearDown network for sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\" successfully" Jan 29 16:13:27.526539 containerd[1493]: time="2025-01-29T16:13:27.524689873Z" level=info msg="StopPodSandbox for \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\" returns successfully" Jan 29 16:13:27.526449 systemd[1]: run-netns-cni\x2ddd7886cc\x2d1855\x2d509f\x2d2bf1\x2d12a3f0cd298d.mount: Deactivated successfully. 
Jan 29 16:13:27.527695 containerd[1493]: time="2025-01-29T16:13:27.527654621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9s2kl,Uid:424af8c2-52c0-4e44-8235-7c2d50b2f3f1,Namespace:kube-system,Attempt:1,}" Jan 29 16:13:27.692415 systemd-networkd[1427]: calie19d29f925e: Gained IPv6LL Jan 29 16:13:27.756486 systemd-networkd[1427]: vxlan.calico: Gained IPv6LL Jan 29 16:13:27.871027 systemd-networkd[1427]: calibe150f923e9: Link UP Jan 29 16:13:27.872888 systemd-networkd[1427]: calibe150f923e9: Gained carrier Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.634 [INFO][4337] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0 calico-apiserver-c7c6f64f5- calico-apiserver 7c206eb9-9889-4da7-833e-4f8997709e6e 833 0 2025-01-29 16:12:58 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:c7c6f64f5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-6bdnt.gb1.brightbox.com calico-apiserver-c7c6f64f5-jnchp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calibe150f923e9 [] []}} ContainerID="3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-jnchp" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.634 [INFO][4337] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-jnchp" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.749 [INFO][4360] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" HandleID="k8s-pod-network.3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.793 [INFO][4360] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" HandleID="k8s-pod-network.3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051ea0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-6bdnt.gb1.brightbox.com", "pod":"calico-apiserver-c7c6f64f5-jnchp", "timestamp":"2025-01-29 16:13:27.749476638 +0000 UTC"}, Hostname:"srv-6bdnt.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.794 [INFO][4360] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.794 [INFO][4360] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.794 [INFO][4360] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6bdnt.gb1.brightbox.com' Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.798 [INFO][4360] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.810 [INFO][4360] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.820 [INFO][4360] ipam/ipam.go 489: Trying affinity for 192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.823 [INFO][4360] ipam/ipam.go 155: Attempting to load block cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.826 [INFO][4360] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.826 [INFO][4360] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.17.128/26 handle="k8s-pod-network.3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.829 [INFO][4360] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811 Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.839 [INFO][4360] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.17.128/26 handle="k8s-pod-network.3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.852 [INFO][4360] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.17.131/26] block=192.168.17.128/26 handle="k8s-pod-network.3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.854 [INFO][4360] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.17.131/26] handle="k8s-pod-network.3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.854 [INFO][4360] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
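The IPAM entries above walk Calico's block-affinity assignment: look up the blocks affine to this host, try the affinity for 192.168.17.128/26, load the block, assign one free address under a freshly created handle, then write the block back to claim the IP (192.168.17.131 here). A toy Go sketch of that walk under assumed, simplified types — not the actual ipam.go code the log cites:

    package main

    import (
    	"fmt"
    	"net"
    )

    // block is a toy model of one Calico IPAM block (a /26 here).
    type block struct {
    	cidr      *net.IPNet
    	allocated map[string]string // IP -> handle ID
    }

    // nextFree scans the /26 for the first unallocated address. The real
    // allocator is far more careful; this only shows the shape of the walk.
    func (b *block) nextFree() net.IP {
    	base := b.cidr.IP.To4()
    	for i := 0; i < 64; i++ { // a /26 holds 64 addresses
    		cand := net.IPv4(base[0], base[1], base[2], base[3]+byte(i))
    		if _, used := b.allocated[cand.String()]; !used {
    			return cand
    		}
    	}
    	return nil
    }

    // autoAssign mirrors the logged sequence: look up the host's affine
    // block ("Looking up existing affinities for host"), load it, assign one
    // address under a new handle ("Creating new handle"), then persist the
    // block ("Writing block in order to claim IPs" — elided here).
    func autoAssign(host, handleID string, affine map[string]*block) (net.IP, error) {
    	blk, ok := affine[host]
    	if !ok {
    		return nil, fmt.Errorf("no affine block for host %s", host)
    	}
    	ip := blk.nextFree() // "Attempting to assign 1 addresses from block"
    	if ip == nil {
    		return nil, fmt.Errorf("block %s exhausted", blk.cidr)
    	}
    	blk.allocated[ip.String()] = handleID
    	return ip, nil
    }

    func main() {
    	_, cidr, _ := net.ParseCIDR("192.168.17.128/26")
    	blk := &block{cidr: cidr, allocated: map[string]string{
    		"192.168.17.128": "network", "192.168.17.129": "earlier-pod-a",
    		"192.168.17.130": "earlier-pod-b",
    	}}
    	ip, _ := autoAssign("srv-6bdnt.gb1.brightbox.com", "k8s-pod-network.3875fb97…",
    		map[string]*block{"srv-6bdnt.gb1.brightbox.com": blk})
    	fmt.Println(ip) // 192.168.17.131, matching the address claimed above
    }

The later assignments in this log (.132, .133, .134) repeat the same walk against the same /26, which is why every pod on this node lands in 192.168.17.128/26.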
Jan 29 16:13:27.912106 containerd[1493]: 2025-01-29 16:13:27.854 [INFO][4360] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.17.131/26] IPv6=[] ContainerID="3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" HandleID="k8s-pod-network.3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:27.913490 containerd[1493]: 2025-01-29 16:13:27.861 [INFO][4337] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-jnchp" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0", GenerateName:"calico-apiserver-c7c6f64f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"7c206eb9-9889-4da7-833e-4f8997709e6e", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7c6f64f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-c7c6f64f5-jnchp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibe150f923e9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:27.913490 containerd[1493]: 2025-01-29 16:13:27.862 [INFO][4337] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.17.131/32] ContainerID="3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-jnchp" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:27.913490 containerd[1493]: 2025-01-29 16:13:27.863 [INFO][4337] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibe150f923e9 ContainerID="3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-jnchp" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:27.913490 containerd[1493]: 2025-01-29 16:13:27.874 [INFO][4337] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-jnchp" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:27.913490 containerd[1493]: 2025-01-29 16:13:27.876 [INFO][4337] cni-plugin/k8s.go 414: Added Mac, 
interface name, and active container ID to endpoint ContainerID="3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-jnchp" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0", GenerateName:"calico-apiserver-c7c6f64f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"7c206eb9-9889-4da7-833e-4f8997709e6e", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7c6f64f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811", Pod:"calico-apiserver-c7c6f64f5-jnchp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibe150f923e9", MAC:"2a:b5:03:54:4b:70", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:27.913490 containerd[1493]: 2025-01-29 16:13:27.903 [INFO][4337] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811" Namespace="calico-apiserver" Pod="calico-apiserver-c7c6f64f5-jnchp" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:28.004639 systemd-networkd[1427]: cali2d8bb57669c: Link UP Jan 29 16:13:28.004985 systemd-networkd[1427]: cali2d8bb57669c: Gained carrier Jan 29 16:13:28.039920 containerd[1493]: time="2025-01-29T16:13:28.038471375Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:13:28.039920 containerd[1493]: time="2025-01-29T16:13:28.038593799Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:13:28.039920 containerd[1493]: time="2025-01-29T16:13:28.038615020Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:28.044803 containerd[1493]: time="2025-01-29T16:13:28.042153057Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.665 [INFO][4346] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0 coredns-6f6b679f8f- kube-system 424af8c2-52c0-4e44-8235-7c2d50b2f3f1 832 0 2025-01-29 16:12:50 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-6bdnt.gb1.brightbox.com coredns-6f6b679f8f-9s2kl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2d8bb57669c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" Namespace="kube-system" Pod="coredns-6f6b679f8f-9s2kl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.666 [INFO][4346] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" Namespace="kube-system" Pod="coredns-6f6b679f8f-9s2kl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.763 [INFO][4364] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" HandleID="k8s-pod-network.628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.797 [INFO][4364] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" HandleID="k8s-pod-network.628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003194c0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-6bdnt.gb1.brightbox.com", "pod":"coredns-6f6b679f8f-9s2kl", "timestamp":"2025-01-29 16:13:27.763836624 +0000 UTC"}, Hostname:"srv-6bdnt.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.798 [INFO][4364] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.854 [INFO][4364] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.854 [INFO][4364] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6bdnt.gb1.brightbox.com' Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.902 [INFO][4364] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.923 [INFO][4364] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.937 [INFO][4364] ipam/ipam.go 489: Trying affinity for 192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.944 [INFO][4364] ipam/ipam.go 155: Attempting to load block cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.950 [INFO][4364] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.951 [INFO][4364] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.17.128/26 handle="k8s-pod-network.628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.955 [INFO][4364] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38 Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.969 [INFO][4364] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.17.128/26 handle="k8s-pod-network.628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.986 [INFO][4364] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.17.132/26] block=192.168.17.128/26 handle="k8s-pod-network.628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.987 [INFO][4364] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.17.132/26] handle="k8s-pod-network.628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.987 [INFO][4364] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 29 16:13:28.060813 containerd[1493]: 2025-01-29 16:13:27.988 [INFO][4364] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.17.132/26] IPv6=[] ContainerID="628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" HandleID="k8s-pod-network.628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:28.063035 containerd[1493]: 2025-01-29 16:13:27.995 [INFO][4346] cni-plugin/k8s.go 386: Populated endpoint ContainerID="628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" Namespace="kube-system" Pod="coredns-6f6b679f8f-9s2kl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"424af8c2-52c0-4e44-8235-7c2d50b2f3f1", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"", Pod:"coredns-6f6b679f8f-9s2kl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2d8bb57669c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:28.063035 containerd[1493]: 2025-01-29 16:13:27.995 [INFO][4346] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.17.132/32] ContainerID="628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" Namespace="kube-system" Pod="coredns-6f6b679f8f-9s2kl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:28.063035 containerd[1493]: 2025-01-29 16:13:27.995 [INFO][4346] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2d8bb57669c ContainerID="628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" Namespace="kube-system" Pod="coredns-6f6b679f8f-9s2kl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:28.063035 containerd[1493]: 2025-01-29 16:13:28.007 [INFO][4346] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" Namespace="kube-system" Pod="coredns-6f6b679f8f-9s2kl" 
WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:28.063035 containerd[1493]: 2025-01-29 16:13:28.017 [INFO][4346] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" Namespace="kube-system" Pod="coredns-6f6b679f8f-9s2kl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"424af8c2-52c0-4e44-8235-7c2d50b2f3f1", ResourceVersion:"832", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38", Pod:"coredns-6f6b679f8f-9s2kl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2d8bb57669c", MAC:"ba:91:e7:d4:30:0a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:28.063035 containerd[1493]: 2025-01-29 16:13:28.053 [INFO][4346] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38" Namespace="kube-system" Pod="coredns-6f6b679f8f-9s2kl" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:28.114438 systemd[1]: Started cri-containerd-3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811.scope - libcontainer container 3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811. Jan 29 16:13:28.178500 containerd[1493]: time="2025-01-29T16:13:28.178452041Z" level=info msg="StopPodSandbox for \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\"" Jan 29 16:13:28.247826 containerd[1493]: time="2025-01-29T16:13:28.246950969Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:13:28.247826 containerd[1493]: time="2025-01-29T16:13:28.247138317Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:13:28.253458 containerd[1493]: time="2025-01-29T16:13:28.247163427Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:28.253458 containerd[1493]: time="2025-01-29T16:13:28.253054354Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:28.341418 systemd[1]: Started cri-containerd-628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38.scope - libcontainer container 628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38. Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.365 [INFO][4461] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.367 [INFO][4461] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" iface="eth0" netns="/var/run/netns/cni-19994588-8be2-3521-3a81-23b84f68c330" Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.368 [INFO][4461] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" iface="eth0" netns="/var/run/netns/cni-19994588-8be2-3521-3a81-23b84f68c330" Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.370 [INFO][4461] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" iface="eth0" netns="/var/run/netns/cni-19994588-8be2-3521-3a81-23b84f68c330" Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.370 [INFO][4461] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.370 [INFO][4461] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.470 [INFO][4490] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" HandleID="k8s-pod-network.d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.470 [INFO][4490] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.471 [INFO][4490] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.496 [WARNING][4490] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" HandleID="k8s-pod-network.d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.496 [INFO][4490] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" HandleID="k8s-pod-network.d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.508 [INFO][4490] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:28.549120 containerd[1493]: 2025-01-29 16:13:28.521 [INFO][4461] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:28.550870 containerd[1493]: time="2025-01-29T16:13:28.549488764Z" level=info msg="TearDown network for sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\" successfully" Jan 29 16:13:28.550870 containerd[1493]: time="2025-01-29T16:13:28.549531700Z" level=info msg="StopPodSandbox for \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\" returns successfully" Jan 29 16:13:28.557679 systemd[1]: run-netns-cni\x2d19994588\x2d8be2\x2d3521\x2d3a81\x2d23b84f68c330.mount: Deactivated successfully. Jan 29 16:13:28.565615 containerd[1493]: time="2025-01-29T16:13:28.565574269Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-hvt88,Uid:ba3c4db0-9089-4994-8c25-05651ef8025d,Namespace:kube-system,Attempt:1,}" Jan 29 16:13:28.612737 containerd[1493]: time="2025-01-29T16:13:28.612264904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-c7c6f64f5-jnchp,Uid:7c206eb9-9889-4da7-833e-4f8997709e6e,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811\"" Jan 29 16:13:28.644080 containerd[1493]: time="2025-01-29T16:13:28.644021494Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9s2kl,Uid:424af8c2-52c0-4e44-8235-7c2d50b2f3f1,Namespace:kube-system,Attempt:1,} returns sandbox id \"628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38\"" Jan 29 16:13:28.691045 containerd[1493]: time="2025-01-29T16:13:28.690359069Z" level=info msg="CreateContainer within sandbox \"628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 29 16:13:28.841794 containerd[1493]: time="2025-01-29T16:13:28.841564373Z" level=info msg="CreateContainer within sandbox \"628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7b8d30e3bdbc9f4ce8d59a9637659fa492a8b8bce24a38abbef6e05ea2853cf6\"" Jan 29 16:13:28.844518 containerd[1493]: time="2025-01-29T16:13:28.844463425Z" level=info msg="StartContainer for \"7b8d30e3bdbc9f4ce8d59a9637659fa492a8b8bce24a38abbef6e05ea2853cf6\"" Jan 29 16:13:28.964399 systemd[1]: Started cri-containerd-7b8d30e3bdbc9f4ce8d59a9637659fa492a8b8bce24a38abbef6e05ea2853cf6.scope - libcontainer container 7b8d30e3bdbc9f4ce8d59a9637659fa492a8b8bce24a38abbef6e05ea2853cf6. 
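In the WorkloadEndpoint dumps above, ports are printed as Go hex literals: 0x35 and 0x23c1 decode to 53 and 9153, the usual coredns DNS and metrics ports. The Protocol{Type:1, ..., StrVal:"UDP"} form appears to follow the Kubernetes intstr convention, where Type 1 marks the string variant; a one-line check of the port decoding:

    package main

    import "fmt"

    func main() {
    	fmt.Println(0x35, 0x23c1) // 53 9153: coredns DNS and metrics ports
    }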
Jan 29 16:13:29.108948 containerd[1493]: time="2025-01-29T16:13:29.108754731Z" level=info msg="StartContainer for \"7b8d30e3bdbc9f4ce8d59a9637659fa492a8b8bce24a38abbef6e05ea2853cf6\" returns successfully" Jan 29 16:13:29.174886 containerd[1493]: time="2025-01-29T16:13:29.174386931Z" level=info msg="StopPodSandbox for \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\"" Jan 29 16:13:29.228491 systemd-networkd[1427]: cali2d8bb57669c: Gained IPv6LL Jan 29 16:13:29.425309 systemd-networkd[1427]: cali01fc5829f8c: Link UP Jan 29 16:13:29.430711 systemd-networkd[1427]: cali01fc5829f8c: Gained carrier Jan 29 16:13:29.481995 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3780139111.mount: Deactivated successfully. Jan 29 16:13:29.485263 systemd-networkd[1427]: calibe150f923e9: Gained IPv6LL Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:28.862 [INFO][4508] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0 coredns-6f6b679f8f- kube-system ba3c4db0-9089-4994-8c25-05651ef8025d 841 0 2025-01-29 16:12:50 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-6bdnt.gb1.brightbox.com coredns-6f6b679f8f-hvt88 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali01fc5829f8c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" Namespace="kube-system" Pod="coredns-6f6b679f8f-hvt88" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:28.864 [INFO][4508] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" Namespace="kube-system" Pod="coredns-6f6b679f8f-hvt88" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.084 [INFO][4531] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" HandleID="k8s-pod-network.cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.219 [INFO][4531] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" HandleID="k8s-pod-network.cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003a4fe0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-6bdnt.gb1.brightbox.com", "pod":"coredns-6f6b679f8f-hvt88", "timestamp":"2025-01-29 16:13:29.084169759 +0000 UTC"}, Hostname:"srv-6bdnt.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.220 [INFO][4531] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.220 [INFO][4531] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.220 [INFO][4531] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6bdnt.gb1.brightbox.com' Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.233 [INFO][4531] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.246 [INFO][4531] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.263 [INFO][4531] ipam/ipam.go 489: Trying affinity for 192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.267 [INFO][4531] ipam/ipam.go 155: Attempting to load block cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.275 [INFO][4531] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.275 [INFO][4531] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.17.128/26 handle="k8s-pod-network.cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.282 [INFO][4531] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867 Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.301 [INFO][4531] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.17.128/26 handle="k8s-pod-network.cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.387 [INFO][4531] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.17.133/26] block=192.168.17.128/26 handle="k8s-pod-network.cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.387 [INFO][4531] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.17.133/26] handle="k8s-pod-network.cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.387 [INFO][4531] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 29 16:13:29.500905 containerd[1493]: 2025-01-29 16:13:29.387 [INFO][4531] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.17.133/26] IPv6=[] ContainerID="cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" HandleID="k8s-pod-network.cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:29.503344 containerd[1493]: 2025-01-29 16:13:29.395 [INFO][4508] cni-plugin/k8s.go 386: Populated endpoint ContainerID="cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" Namespace="kube-system" Pod="coredns-6f6b679f8f-hvt88" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"ba3c4db0-9089-4994-8c25-05651ef8025d", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"", Pod:"coredns-6f6b679f8f-hvt88", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali01fc5829f8c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:29.503344 containerd[1493]: 2025-01-29 16:13:29.397 [INFO][4508] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.17.133/32] ContainerID="cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" Namespace="kube-system" Pod="coredns-6f6b679f8f-hvt88" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:29.503344 containerd[1493]: 2025-01-29 16:13:29.397 [INFO][4508] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali01fc5829f8c ContainerID="cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" Namespace="kube-system" Pod="coredns-6f6b679f8f-hvt88" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:29.503344 containerd[1493]: 2025-01-29 16:13:29.441 [INFO][4508] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" Namespace="kube-system" Pod="coredns-6f6b679f8f-hvt88" 
WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:29.503344 containerd[1493]: 2025-01-29 16:13:29.443 [INFO][4508] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" Namespace="kube-system" Pod="coredns-6f6b679f8f-hvt88" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"ba3c4db0-9089-4994-8c25-05651ef8025d", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867", Pod:"coredns-6f6b679f8f-hvt88", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali01fc5829f8c", MAC:"d2:8c:19:3c:32:96", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:29.503344 containerd[1493]: 2025-01-29 16:13:29.493 [INFO][4508] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867" Namespace="kube-system" Pod="coredns-6f6b679f8f-hvt88" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:29.667648 containerd[1493]: time="2025-01-29T16:13:29.667082642Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:13:29.667648 containerd[1493]: time="2025-01-29T16:13:29.667185816Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:13:29.667648 containerd[1493]: time="2025-01-29T16:13:29.667252074Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:29.667648 containerd[1493]: time="2025-01-29T16:13:29.667410892Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.488 [INFO][4594] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.490 [INFO][4594] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" iface="eth0" netns="/var/run/netns/cni-b7f9ba8a-7598-4184-1dc1-cc7447667de7" Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.491 [INFO][4594] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" iface="eth0" netns="/var/run/netns/cni-b7f9ba8a-7598-4184-1dc1-cc7447667de7" Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.492 [INFO][4594] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" iface="eth0" netns="/var/run/netns/cni-b7f9ba8a-7598-4184-1dc1-cc7447667de7" Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.492 [INFO][4594] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.492 [INFO][4594] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.630 [INFO][4606] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" HandleID="k8s-pod-network.3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.630 [INFO][4606] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.630 [INFO][4606] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.689 [WARNING][4606] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" HandleID="k8s-pod-network.3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.691 [INFO][4606] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" HandleID="k8s-pod-network.3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.701 [INFO][4606] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:29.740914 containerd[1493]: 2025-01-29 16:13:29.728 [INFO][4594] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:29.745398 containerd[1493]: time="2025-01-29T16:13:29.743715756Z" level=info msg="TearDown network for sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\" successfully" Jan 29 16:13:29.745398 containerd[1493]: time="2025-01-29T16:13:29.743754802Z" level=info msg="StopPodSandbox for \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\" returns successfully" Jan 29 16:13:29.747314 systemd[1]: Started cri-containerd-cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867.scope - libcontainer container cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867. Jan 29 16:13:29.753806 systemd[1]: run-netns-cni\x2db7f9ba8a\x2d7598\x2d4184\x2d1dc1\x2dcc7447667de7.mount: Deactivated successfully. Jan 29 16:13:29.757557 containerd[1493]: time="2025-01-29T16:13:29.757518324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gd895,Uid:e9d2429c-47e2-48a0-bc35-409f18438229,Namespace:calico-system,Attempt:1,}" Jan 29 16:13:29.871131 containerd[1493]: time="2025-01-29T16:13:29.871081482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-hvt88,Uid:ba3c4db0-9089-4994-8c25-05651ef8025d,Namespace:kube-system,Attempt:1,} returns sandbox id \"cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867\"" Jan 29 16:13:29.882594 containerd[1493]: time="2025-01-29T16:13:29.882384299Z" level=info msg="CreateContainer within sandbox \"cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 29 16:13:29.916734 containerd[1493]: time="2025-01-29T16:13:29.916683075Z" level=info msg="CreateContainer within sandbox \"cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"60e58cbb95e1a272f9d0e85a50580f846756da8d7ca95b1a25740b5c4208ee7b\"" Jan 29 16:13:29.918771 containerd[1493]: time="2025-01-29T16:13:29.918715562Z" level=info msg="StartContainer for \"60e58cbb95e1a272f9d0e85a50580f846756da8d7ca95b1a25740b5c4208ee7b\"" Jan 29 16:13:29.998762 systemd[1]: Started cri-containerd-60e58cbb95e1a272f9d0e85a50580f846756da8d7ca95b1a25740b5c4208ee7b.scope - libcontainer container 60e58cbb95e1a272f9d0e85a50580f846756da8d7ca95b1a25740b5c4208ee7b. 
Jan 29 16:13:30.099200 containerd[1493]: time="2025-01-29T16:13:30.098969815Z" level=info msg="StartContainer for \"60e58cbb95e1a272f9d0e85a50580f846756da8d7ca95b1a25740b5c4208ee7b\" returns successfully"
Jan 29 16:13:30.273312 systemd-networkd[1427]: calicca68b5c8d8: Link UP
Jan 29 16:13:30.277205 systemd-networkd[1427]: calicca68b5c8d8: Gained carrier
Jan 29 16:13:30.308294 kubelet[2663]: I0129 16:13:30.307835 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-9s2kl" podStartSLOduration=40.306784705 podStartE2EDuration="40.306784705s" podCreationTimestamp="2025-01-29 16:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 16:13:29.694346436 +0000 UTC m=+45.713664245" watchObservedRunningTime="2025-01-29 16:13:30.306784705 +0000 UTC m=+46.326102525"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:29.928 [INFO][4659] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0 csi-node-driver- calico-system e9d2429c-47e2-48a0-bc35-409f18438229 853 0 2025-01-29 16:12:57 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:56747c9949 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-6bdnt.gb1.brightbox.com csi-node-driver-gd895 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calicca68b5c8d8 [] []}} ContainerID="b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" Namespace="calico-system" Pod="csi-node-driver-gd895" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:29.928 [INFO][4659] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" Namespace="calico-system" Pod="csi-node-driver-gd895" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.073 [INFO][4688] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" HandleID="k8s-pod-network.b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.100 [INFO][4688] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" HandleID="k8s-pod-network.b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00056fd10), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-6bdnt.gb1.brightbox.com", "pod":"csi-node-driver-gd895", "timestamp":"2025-01-29 16:13:30.073099398 +0000 UTC"}, Hostname:"srv-6bdnt.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.100 [INFO][4688] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.100 [INFO][4688] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.100 [INFO][4688] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6bdnt.gb1.brightbox.com'
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.107 [INFO][4688] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" host="srv-6bdnt.gb1.brightbox.com"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.196 [INFO][4688] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6bdnt.gb1.brightbox.com"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.208 [INFO][4688] ipam/ipam.go 489: Trying affinity for 192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.213 [INFO][4688] ipam/ipam.go 155: Attempting to load block cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.223 [INFO][4688] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.223 [INFO][4688] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.17.128/26 handle="k8s-pod-network.b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" host="srv-6bdnt.gb1.brightbox.com"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.236 [INFO][4688] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.251 [INFO][4688] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.17.128/26 handle="k8s-pod-network.b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" host="srv-6bdnt.gb1.brightbox.com"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.261 [INFO][4688] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.17.134/26] block=192.168.17.128/26 handle="k8s-pod-network.b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" host="srv-6bdnt.gb1.brightbox.com"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.261 [INFO][4688] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.17.134/26] handle="k8s-pod-network.b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" host="srv-6bdnt.gb1.brightbox.com"
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.261 [INFO][4688] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jan 29 16:13:30.311116 containerd[1493]: 2025-01-29 16:13:30.261 [INFO][4688] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.17.134/26] IPv6=[] ContainerID="b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" HandleID="k8s-pod-network.b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0"
Jan 29 16:13:30.314553 containerd[1493]: 2025-01-29 16:13:30.266 [INFO][4659] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" Namespace="calico-system" Pod="csi-node-driver-gd895" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e9d2429c-47e2-48a0-bc35-409f18438229", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 57, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-gd895", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.17.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicca68b5c8d8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 29 16:13:30.314553 containerd[1493]: 2025-01-29 16:13:30.266 [INFO][4659] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.17.134/32] ContainerID="b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" Namespace="calico-system" Pod="csi-node-driver-gd895" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0"
Jan 29 16:13:30.314553 containerd[1493]: 2025-01-29 16:13:30.266 [INFO][4659] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicca68b5c8d8 ContainerID="b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" Namespace="calico-system" Pod="csi-node-driver-gd895" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0"
Jan 29 16:13:30.314553 containerd[1493]: 2025-01-29 16:13:30.277 [INFO][4659] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" Namespace="calico-system" Pod="csi-node-driver-gd895" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0"
Jan 29 16:13:30.314553 containerd[1493]: 2025-01-29 16:13:30.280 [INFO][4659] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" Namespace="calico-system" Pod="csi-node-driver-gd895" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e9d2429c-47e2-48a0-bc35-409f18438229", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 57, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396", Pod:"csi-node-driver-gd895", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.17.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicca68b5c8d8", MAC:"62:cf:b5:8d:fd:e5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Jan 29 16:13:30.314553 containerd[1493]: 2025-01-29 16:13:30.304 [INFO][4659] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396" Namespace="calico-system" Pod="csi-node-driver-gd895" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0"
Jan 29 16:13:30.406683 containerd[1493]: time="2025-01-29T16:13:30.405840402Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 29 16:13:30.406683 containerd[1493]: time="2025-01-29T16:13:30.406616906Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 29 16:13:30.408872 containerd[1493]: time="2025-01-29T16:13:30.408591690Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 29 16:13:30.409658 containerd[1493]: time="2025-01-29T16:13:30.409595646Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 29 16:13:30.457134 systemd[1]: Started cri-containerd-b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396.scope - libcontainer container b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396.
Jan 29 16:13:30.566831 containerd[1493]: time="2025-01-29T16:13:30.566782181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gd895,Uid:e9d2429c-47e2-48a0-bc35-409f18438229,Namespace:calico-system,Attempt:1,} returns sandbox id \"b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396\""
Jan 29 16:13:30.649760 kubelet[2663]: I0129 16:13:30.649356 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-hvt88" podStartSLOduration=40.649326678 podStartE2EDuration="40.649326678s" podCreationTimestamp="2025-01-29 16:12:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 16:13:30.620630904 +0000 UTC m=+46.639948727" watchObservedRunningTime="2025-01-29 16:13:30.649326678 +0000 UTC m=+46.668644482"
Jan 29 16:13:30.769340 containerd[1493]: time="2025-01-29T16:13:30.769019252Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 29 16:13:30.771021 containerd[1493]: time="2025-01-29T16:13:30.770953133Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404"
Jan 29 16:13:30.772093 containerd[1493]: time="2025-01-29T16:13:30.772033217Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 29 16:13:30.775252 containerd[1493]: time="2025-01-29T16:13:30.775192082Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 29 16:13:30.776469 containerd[1493]: time="2025-01-29T16:13:30.776366954Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 5.787984122s"
Jan 29 16:13:30.776469 containerd[1493]: time="2025-01-29T16:13:30.776422291Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\""
Jan 29 16:13:30.777842 containerd[1493]: time="2025-01-29T16:13:30.777779676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\""
Jan 29 16:13:30.780451 containerd[1493]: time="2025-01-29T16:13:30.780417808Z" level=info msg="CreateContainer within sandbox \"f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Jan 29 16:13:30.802911 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3606009063.mount: Deactivated successfully.
Jan 29 16:13:30.806609 containerd[1493]: time="2025-01-29T16:13:30.806493342Z" level=info msg="CreateContainer within sandbox \"f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"313ca8a941f1e142b0deb0f725c87eab59a9cb2fb7891f25d4fcb939276a0b1f\""
Jan 29 16:13:30.807373 containerd[1493]: time="2025-01-29T16:13:30.807163186Z" level=info msg="StartContainer for \"313ca8a941f1e142b0deb0f725c87eab59a9cb2fb7891f25d4fcb939276a0b1f\""
Jan 29 16:13:30.827806 systemd-networkd[1427]: cali01fc5829f8c: Gained IPv6LL
Jan 29 16:13:30.869420 systemd[1]: Started cri-containerd-313ca8a941f1e142b0deb0f725c87eab59a9cb2fb7891f25d4fcb939276a0b1f.scope - libcontainer container 313ca8a941f1e142b0deb0f725c87eab59a9cb2fb7891f25d4fcb939276a0b1f.
Jan 29 16:13:30.932012 containerd[1493]: time="2025-01-29T16:13:30.931558653Z" level=info msg="StartContainer for \"313ca8a941f1e142b0deb0f725c87eab59a9cb2fb7891f25d4fcb939276a0b1f\" returns successfully"
Jan 29 16:13:31.638002 kubelet[2663]: I0129 16:13:31.637157 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-c7c6f64f5-hshkw" podStartSLOduration=27.847292884 podStartE2EDuration="33.637112896s" podCreationTimestamp="2025-01-29 16:12:58 +0000 UTC" firstStartedPulling="2025-01-29 16:13:24.987815064 +0000 UTC m=+41.007132856" lastFinishedPulling="2025-01-29 16:13:30.777635077 +0000 UTC m=+46.796952868" observedRunningTime="2025-01-29 16:13:31.620908835 +0000 UTC m=+47.640226639" watchObservedRunningTime="2025-01-29 16:13:31.637112896 +0000 UTC m=+47.656430698"
Jan 29 16:13:31.851507 systemd-networkd[1427]: calicca68b5c8d8: Gained IPv6LL
Jan 29 16:13:32.615244 kubelet[2663]: I0129 16:13:32.614642 2663 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 29 16:13:34.222220 containerd[1493]: time="2025-01-29T16:13:34.221517488Z" level=info msg="StopContainer for \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\" with timeout 300 (s)"
Jan 29 16:13:34.225819 containerd[1493]: time="2025-01-29T16:13:34.225539888Z" level=info msg="Stop container \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\" with signal terminated"
Jan 29 16:13:34.442655 containerd[1493]: time="2025-01-29T16:13:34.442591753Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 29 16:13:34.444316 containerd[1493]: time="2025-01-29T16:13:34.444143656Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192"
Jan 29 16:13:34.445966 containerd[1493]: time="2025-01-29T16:13:34.445917970Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 29 16:13:34.450957 containerd[1493]: time="2025-01-29T16:13:34.450921661Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 29 16:13:34.453409 containerd[1493]: time="2025-01-29T16:13:34.453342106Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 3.67552208s"
Jan 29 16:13:34.453409 containerd[1493]: time="2025-01-29T16:13:34.453391443Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\""
Jan 29 16:13:34.457073 containerd[1493]: time="2025-01-29T16:13:34.457025625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\""
Jan 29 16:13:34.501050 containerd[1493]: time="2025-01-29T16:13:34.500865521Z" level=info msg="CreateContainer within sandbox \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Jan 29 16:13:34.541774 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1332013498.mount: Deactivated successfully.
Jan 29 16:13:34.544717 containerd[1493]: time="2025-01-29T16:13:34.544536383Z" level=info msg="CreateContainer within sandbox \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\""
Jan 29 16:13:34.548363 containerd[1493]: time="2025-01-29T16:13:34.548329441Z" level=info msg="StartContainer for \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\""
Jan 29 16:13:34.698398 systemd[1]: Started cri-containerd-faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c.scope - libcontainer container faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c.
Jan 29 16:13:34.836204 containerd[1493]: time="2025-01-29T16:13:34.834453637Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 29 16:13:34.838337 containerd[1493]: time="2025-01-29T16:13:34.838227807Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77"
Jan 29 16:13:34.845390 containerd[1493]: time="2025-01-29T16:13:34.845354579Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 388.281552ms"
Jan 29 16:13:34.845718 containerd[1493]: time="2025-01-29T16:13:34.845668553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\""
Jan 29 16:13:34.849620 containerd[1493]: time="2025-01-29T16:13:34.849446060Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\""
Jan 29 16:13:34.858711 containerd[1493]: time="2025-01-29T16:13:34.858535554Z" level=info msg="CreateContainer within sandbox \"3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Jan 29 16:13:34.912927 containerd[1493]: time="2025-01-29T16:13:34.912866774Z" level=info msg="StartContainer for \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\" returns successfully"
Jan 29 16:13:34.913792 containerd[1493]: time="2025-01-29T16:13:34.913560982Z" level=info msg="CreateContainer within sandbox \"3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5f38527566c4d8885a052fd120daf81a90b55a25afb9cf98847b82f715a9db86\""
Jan 29 16:13:34.916037 containerd[1493]: time="2025-01-29T16:13:34.915993952Z" level=info msg="StartContainer for \"5f38527566c4d8885a052fd120daf81a90b55a25afb9cf98847b82f715a9db86\""
Jan 29 16:13:35.001495 systemd[1]: Started cri-containerd-5f38527566c4d8885a052fd120daf81a90b55a25afb9cf98847b82f715a9db86.scope - libcontainer container 5f38527566c4d8885a052fd120daf81a90b55a25afb9cf98847b82f715a9db86.
Jan 29 16:13:35.012586 containerd[1493]: time="2025-01-29T16:13:35.008018021Z" level=info msg="StopContainer for \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\" with timeout 5 (s)"
Jan 29 16:13:35.012586 containerd[1493]: time="2025-01-29T16:13:35.009497849Z" level=info msg="Stop container \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\" with signal terminated"
Jan 29 16:13:35.096220 systemd[1]: cri-containerd-3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395.scope: Deactivated successfully.
Jan 29 16:13:35.096586 systemd[1]: cri-containerd-3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395.scope: Consumed 2.442s CPU time.
Jan 29 16:13:35.161976 containerd[1493]: time="2025-01-29T16:13:35.161377507Z" level=info msg="StartContainer for \"5f38527566c4d8885a052fd120daf81a90b55a25afb9cf98847b82f715a9db86\" returns successfully"
Jan 29 16:13:35.325740 containerd[1493]: time="2025-01-29T16:13:35.325330360Z" level=info msg="shim disconnected" id=3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395 namespace=k8s.io
Jan 29 16:13:35.325740 containerd[1493]: time="2025-01-29T16:13:35.325503145Z" level=warning msg="cleaning up after shim disconnected" id=3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395 namespace=k8s.io
Jan 29 16:13:35.325740 containerd[1493]: time="2025-01-29T16:13:35.325544455Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 29 16:13:35.381157 containerd[1493]: time="2025-01-29T16:13:35.379975199Z" level=info msg="StopContainer for \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\" returns successfully"
Jan 29 16:13:35.389901 containerd[1493]: time="2025-01-29T16:13:35.389317498Z" level=info msg="StopPodSandbox for \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\""
Jan 29 16:13:35.389901 containerd[1493]: time="2025-01-29T16:13:35.389378787Z" level=info msg="Container to stop \"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Jan 29 16:13:35.389901 containerd[1493]: time="2025-01-29T16:13:35.389403457Z" level=info msg="Container to stop \"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Jan 29 16:13:35.389901 containerd[1493]: time="2025-01-29T16:13:35.389419448Z" level=info msg="Container to stop \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Jan 29 16:13:35.403446 systemd[1]: cri-containerd-4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409.scope: Deactivated successfully.
Jan 29 16:13:35.455849 containerd[1493]: time="2025-01-29T16:13:35.455213598Z" level=info msg="shim disconnected" id=4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409 namespace=k8s.io
Jan 29 16:13:35.455849 containerd[1493]: time="2025-01-29T16:13:35.455296174Z" level=warning msg="cleaning up after shim disconnected" id=4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409 namespace=k8s.io
Jan 29 16:13:35.455849 containerd[1493]: time="2025-01-29T16:13:35.455314542Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 29 16:13:35.483435 containerd[1493]: time="2025-01-29T16:13:35.483380377Z" level=warning msg="cleanup warnings time=\"2025-01-29T16:13:35Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
Jan 29 16:13:35.490337 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395-rootfs.mount: Deactivated successfully.
Jan 29 16:13:35.491912 containerd[1493]: time="2025-01-29T16:13:35.490584235Z" level=info msg="TearDown network for sandbox \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" successfully"
Jan 29 16:13:35.491912 containerd[1493]: time="2025-01-29T16:13:35.490615794Z" level=info msg="StopPodSandbox for \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" returns successfully"
Jan 29 16:13:35.492170 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409-rootfs.mount: Deactivated successfully.
Jan 29 16:13:35.492315 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409-shm.mount: Deactivated successfully.
Jan 29 16:13:35.595649 kubelet[2663]: I0129 16:13:35.595003 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-var-run-calico\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.595649 kubelet[2663]: I0129 16:13:35.595090 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-lib-modules\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.595649 kubelet[2663]: I0129 16:13:35.595131 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-tigera-ca-bundle\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.595649 kubelet[2663]: I0129 16:13:35.595156 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-net-dir\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.595649 kubelet[2663]: I0129 16:13:35.595202 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-bin-dir\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.595649 kubelet[2663]: I0129 16:13:35.595233 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-node-certs\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.599388 kubelet[2663]: I0129 16:13:35.595295 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-policysync\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.599388 kubelet[2663]: I0129 16:13:35.595334 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd9ph\" (UniqueName: \"kubernetes.io/projected/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-kube-api-access-bd9ph\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.599388 kubelet[2663]: I0129 16:13:35.595366 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-var-lib-calico\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.599388 kubelet[2663]: I0129 16:13:35.595392 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-log-dir\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.599388 kubelet[2663]: I0129 16:13:35.595421 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-flexvol-driver-host\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.599388 kubelet[2663]: I0129 16:13:35.595444 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-xtables-lock\") pod \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\" (UID: \"ac778ba4-a7a1-4547-bfec-fdc4b3b030c3\") "
Jan 29 16:13:35.628690 kubelet[2663]: I0129 16:13:35.624980 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 16:13:35.637175 kubelet[2663]: I0129 16:13:35.635723 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 16:13:35.637175 kubelet[2663]: I0129 16:13:35.636746 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 16:13:35.637175 kubelet[2663]: E0129 16:13:35.636792 2663 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" containerName="calico-node"
Jan 29 16:13:35.637175 kubelet[2663]: E0129 16:13:35.636831 2663 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" containerName="flexvol-driver"
Jan 29 16:13:35.637175 kubelet[2663]: E0129 16:13:35.636844 2663 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" containerName="install-cni"
Jan 29 16:13:35.637175 kubelet[2663]: I0129 16:13:35.636918 2663 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" containerName="calico-node"
Jan 29 16:13:35.639679 kubelet[2663]: I0129 16:13:35.639652 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-policysync" (OuterVolumeSpecName: "policysync") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 16:13:35.640054 kubelet[2663]: I0129 16:13:35.639829 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 16:13:35.640054 kubelet[2663]: I0129 16:13:35.639871 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 16:13:35.659684 systemd[1]: var-lib-kubelet-pods-ac778ba4\x2da7a1\x2d4547\x2dbfec\x2dfdc4b3b030c3-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully.
Jan 29 16:13:35.663758 kubelet[2663]: I0129 16:13:35.663716 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 16:13:35.663911 kubelet[2663]: I0129 16:13:35.663798 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 16:13:35.663911 kubelet[2663]: I0129 16:13:35.663853 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 29 16:13:35.673048 kubelet[2663]: I0129 16:13:35.672995 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-node-certs" (OuterVolumeSpecName: "node-certs") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 29 16:13:35.673159 kubelet[2663]: I0129 16:13:35.673122 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-kube-api-access-bd9ph" (OuterVolumeSpecName: "kube-api-access-bd9ph") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "kube-api-access-bd9ph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 29 16:13:35.673759 systemd[1]: var-lib-kubelet-pods-ac778ba4\x2da7a1\x2d4547\x2dbfec\x2dfdc4b3b030c3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dbd9ph.mount: Deactivated successfully.
Jan 29 16:13:35.693981 systemd[1]: var-lib-kubelet-pods-ac778ba4\x2da7a1\x2d4547\x2dbfec\x2dfdc4b3b030c3-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully.
Jan 29 16:13:35.697238 containerd[1493]: time="2025-01-29T16:13:35.695659098Z" level=info msg="StopContainer for \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\" with timeout 30 (s)"
Jan 29 16:13:35.697360 kubelet[2663]: I0129 16:13:35.696478 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" (UID: "ac778ba4-a7a1-4547-bfec-fdc4b3b030c3"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 29 16:13:35.698085 systemd[1]: Created slice kubepods-besteffort-pod1097b7ed_9ec5_4d1d_9861_e504a1b3e4da.slice - libcontainer container kubepods-besteffort-pod1097b7ed_9ec5_4d1d_9861_e504a1b3e4da.slice.
Jan 29 16:13:35.701078 containerd[1493]: time="2025-01-29T16:13:35.700382755Z" level=info msg="Stop container \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\" with signal terminated"
Jan 29 16:13:35.730951 kubelet[2663]: I0129 16:13:35.730492 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-var-run-calico\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.731129 kubelet[2663]: I0129 16:13:35.731107 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-cni-log-dir\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.732136 kubelet[2663]: I0129 16:13:35.731301 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-xtables-lock\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.732136 kubelet[2663]: I0129 16:13:35.731342 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-tigera-ca-bundle\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.732136 kubelet[2663]: I0129 16:13:35.731382 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-var-lib-calico\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.732136 kubelet[2663]: I0129 16:13:35.731430 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-lib-modules\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.732434 kubelet[2663]: I0129 16:13:35.732218 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-cni-net-dir\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.732434 kubelet[2663]: I0129 16:13:35.732296 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-cni-bin-dir\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.732434 kubelet[2663]: I0129 16:13:35.732330 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-flexvol-driver-host\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.732434 kubelet[2663]: I0129 16:13:35.732362 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxnf\" (UniqueName: \"kubernetes.io/projected/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-kube-api-access-pzxnf\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.732434 kubelet[2663]: I0129 16:13:35.732391 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-policysync\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.732673 kubelet[2663]: I0129 16:13:35.732441 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/1097b7ed-9ec5-4d1d-9861-e504a1b3e4da-node-certs\") pod \"calico-node-kv5nf\" (UID: \"1097b7ed-9ec5-4d1d-9861-e504a1b3e4da\") " pod="calico-system/calico-node-kv5nf"
Jan 29 16:13:35.735340 kubelet[2663]: I0129 16:13:35.735089 2663 reconciler_common.go:288] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-flexvol-driver-host\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.735340 kubelet[2663]: I0129 16:13:35.735123 2663 reconciler_common.go:288] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-xtables-lock\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.735340 kubelet[2663]: I0129 16:13:35.735141 2663 reconciler_common.go:288] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-lib-modules\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.735862 kubelet[2663]: I0129 16:13:35.735163 2663 reconciler_common.go:288] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-var-run-calico\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.735940 kubelet[2663]: I0129 16:13:35.735883 2663 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-tigera-ca-bundle\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.735940 kubelet[2663]: I0129 16:13:35.735903 2663 reconciler_common.go:288] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-net-dir\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.735940 kubelet[2663]: I0129 16:13:35.735920 2663 reconciler_common.go:288] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-node-certs\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.735940 kubelet[2663]: I0129 16:13:35.735935 2663 reconciler_common.go:288] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-bin-dir\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.736140 kubelet[2663]: I0129 16:13:35.735969 2663 reconciler_common.go:288] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-policysync\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.736140 kubelet[2663]: I0129 16:13:35.735985 2663 reconciler_common.go:288] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-cni-log-dir\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.736140 kubelet[2663]: I0129 16:13:35.735999 2663 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-bd9ph\" (UniqueName: \"kubernetes.io/projected/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-kube-api-access-bd9ph\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.736140 kubelet[2663]: I0129 16:13:35.736016 2663 reconciler_common.go:288] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3-var-lib-calico\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\""
Jan 29 16:13:35.757219 kubelet[2663]: I0129 16:13:35.754718 2663 scope.go:117] "RemoveContainer" containerID="3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395"
Jan 29 16:13:35.770390 systemd[1]: Removed slice kubepods-besteffort-podac778ba4_a7a1_4547_bfec_fdc4b3b030c3.slice - libcontainer container kubepods-besteffort-podac778ba4_a7a1_4547_bfec_fdc4b3b030c3.slice.
Jan 29 16:13:35.770557 systemd[1]: kubepods-besteffort-podac778ba4_a7a1_4547_bfec_fdc4b3b030c3.slice: Consumed 3.265s CPU time.
Jan 29 16:13:35.778449 containerd[1493]: time="2025-01-29T16:13:35.777694319Z" level=info msg="RemoveContainer for \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\""
Jan 29 16:13:35.785375 systemd[1]: cri-containerd-faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c.scope: Deactivated successfully.
Jan 29 16:13:35.796014 systemd[1]: cri-containerd-ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400.scope: Deactivated successfully.
Jan 29 16:13:35.828208 containerd[1493]: time="2025-01-29T16:13:35.827612580Z" level=info msg="RemoveContainer for \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\" returns successfully"
Jan 29 16:13:35.848756 kubelet[2663]: I0129 16:13:35.848256 2663 scope.go:117] "RemoveContainer" containerID="33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7"
Jan 29 16:13:35.899085 containerd[1493]: time="2025-01-29T16:13:35.898742683Z" level=info msg="RemoveContainer for \"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7\""
Jan 29 16:13:35.921380 kubelet[2663]: I0129 16:13:35.921294 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-85cfdd4458-rcthl" podStartSLOduration=30.498880619 podStartE2EDuration="38.921244754s" podCreationTimestamp="2025-01-29 16:12:57 +0000 UTC" firstStartedPulling="2025-01-29 16:13:26.033311067 +0000 UTC m=+42.052628859" lastFinishedPulling="2025-01-29 16:13:34.455675202 +0000 UTC m=+50.474992994" observedRunningTime="2025-01-29 16:13:35.805836934 +0000 UTC m=+51.825154725" watchObservedRunningTime="2025-01-29 16:13:35.921244754 +0000 UTC m=+51.940562596"
Jan 29 16:13:35.953952 containerd[1493]: time="2025-01-29T16:13:35.952687492Z" level=info msg="RemoveContainer for \"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7\" returns successfully"
Jan 29 16:13:35.956653 kubelet[2663]: I0129 16:13:35.954541 2663 scope.go:117] "RemoveContainer" containerID="50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030"
Jan 29 16:13:35.982332 containerd[1493]: time="2025-01-29T16:13:35.981952680Z" level=info msg="RemoveContainer for \"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030\""
Jan 29 16:13:35.989399 containerd[1493]: time="2025-01-29T16:13:35.988710024Z" level=info msg="shim disconnected" id=faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c namespace=k8s.io
Jan 29 16:13:35.989399 containerd[1493]: time="2025-01-29T16:13:35.988803439Z" level=warning msg="cleaning up after shim disconnected" id=faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c namespace=k8s.io
Jan 29 16:13:35.989399 containerd[1493]: time="2025-01-29T16:13:35.988832273Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 29 16:13:35.993392 containerd[1493]: time="2025-01-29T16:13:35.992243216Z" level=error msg="ExecSync for \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\" failed" error="failed to exec in container: failed to start exec \"e0aab7684118fa6dbe7def637b0ee8f267c6555e8d00f1a4318bbc586ac95ddf\": OCI runtime exec failed: exec failed: cannot exec in a stopped container: unknown"
Jan 29 16:13:35.993392 containerd[1493]: time="2025-01-29T16:13:35.992432206Z" level=info msg="RemoveContainer for \"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030\" returns successfully"
Jan 29 16:13:35.996155 containerd[1493]: time="2025-01-29T16:13:35.995145838Z" level=info msg="shim disconnected" id=ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400 namespace=k8s.io
Jan 29 16:13:35.996155 containerd[1493]: time="2025-01-29T16:13:35.995214630Z" level=warning msg="cleaning up after shim disconnected" id=ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400 namespace=k8s.io
Jan 29 16:13:35.996155 containerd[1493]: time="2025-01-29T16:13:35.995232562Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 29 16:13:35.996506 kubelet[2663]: I0129 16:13:35.995674 2663 scope.go:117] "RemoveContainer" containerID="3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395"
Jan 29 16:13:36.002729 kubelet[2663]: E0129 16:13:36.002506 2663 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: failed to start exec \"e0aab7684118fa6dbe7def637b0ee8f267c6555e8d00f1a4318bbc586ac95ddf\": OCI runtime exec failed: exec failed: cannot exec in a stopped container: unknown" containerID="faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c" cmd=["/usr/bin/check-status","-r"]
Jan 29 16:13:36.009073 kubelet[2663]: I0129 16:13:36.008788 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-c7c6f64f5-jnchp" podStartSLOduration=31.794662462 podStartE2EDuration="38.008761904s" podCreationTimestamp="2025-01-29 16:12:58 +0000 UTC" firstStartedPulling="2025-01-29 16:13:28.63434038 +0000 UTC m=+44.653658174" lastFinishedPulling="2025-01-29 16:13:34.848439822 +0000 UTC m=+50.867757616" observedRunningTime="2025-01-29 16:13:36.007504656 +0000 UTC m=+52.026822465" watchObservedRunningTime="2025-01-29 16:13:36.008761904 +0000 UTC m=+52.028079706"
Jan 29 16:13:36.034277 containerd[1493]: time="2025-01-29T16:13:35.995990136Z" level=error msg="ContainerStatus for \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\": not found"
Jan 29 16:13:36.034277 containerd[1493]: time="2025-01-29T16:13:36.003672216Z" level=error msg="ExecSync for \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\" failed" error="rpc error: code = NotFound desc = failed to exec in container: failed to load task: no running task found: task faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c not found: not found"
Jan 29 16:13:36.035004 kubelet[2663]: E0129 16:13:36.034639 2663 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = failed to exec in container: failed to load task: no running task found: task faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c not found: not found" containerID="faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c" cmd=["/usr/bin/check-status","-r"]
Jan 29 16:13:36.035004 kubelet[2663]: E0129 16:13:36.034822 2663 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\": not found" containerID="3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395"
Jan 29 16:13:36.036465 kubelet[2663]: I0129 16:13:36.036397 2663 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395"} err="failed to get container status \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\": rpc error: code = NotFound desc = an error occurred when try to find container \"3df79159f8557ffc4ecce06666949cd833d60786e74bb1b95e4e969eb06b8395\": not found"
Jan 29 16:13:36.036745 kubelet[2663]: I0129 16:13:36.036606 2663 scope.go:117] "RemoveContainer" containerID="33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7"
Jan 29 16:13:36.040216 containerd[1493]: time="2025-01-29T16:13:36.040149546Z" level=error msg="ContainerStatus for \"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7\": not found"
Jan 29 16:13:36.041522 kubelet[2663]: E0129 16:13:36.041322 2663 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7\": not found" containerID="33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7"
Jan 29 16:13:36.041522 kubelet[2663]: I0129 16:13:36.041387 2663 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7"} err="failed to get container status \"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7\": rpc error: code = NotFound desc = an error occurred when try to find container \"33aa22a123d93af6cdbd8fa2c64a6aba71f35b4af137dcb3e78f4b76723ed8a7\": not found"
Jan 29 16:13:36.041522 kubelet[2663]: I0129 16:13:36.041414 2663 scope.go:117] "RemoveContainer" containerID="50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030"
Jan 29 16:13:36.042388 containerd[1493]: time="2025-01-29T16:13:36.042005584Z" level=error msg="ContainerStatus for \"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030\": not found"
Jan 29 16:13:36.042654 kubelet[2663]: E0129 16:13:36.042548 2663 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030\": not found" containerID="50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030"
Jan 29 16:13:36.042654 kubelet[2663]: I0129 16:13:36.042582 2663 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030"} err="failed to get container status \"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030\": rpc error: code = NotFound desc = an error occurred when try to find container \"50cf638e467bb5fa84d329b7d79cfe8d94a65f0579938868f6d2a6a21d9df030\": not found"
Jan 29 16:13:36.044363 containerd[1493]: time="2025-01-29T16:13:36.043108210Z" level=error msg="ExecSync for \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\" failed" error="rpc error: code = NotFound desc = failed to exec in container: failed to load task: no running task found: task faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c not found: not found"
Jan 29 16:13:36.044510 kubelet[2663]: E0129 16:13:36.043284 2663 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = failed to exec in container: failed to load task: no running task found: task faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c not found: not found" containerID="faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c" cmd=["/usr/bin/check-status","-r"]
Jan 29 16:13:36.047217 containerd[1493]: time="2025-01-29T16:13:36.046663963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kv5nf,Uid:1097b7ed-9ec5-4d1d-9861-e504a1b3e4da,Namespace:calico-system,Attempt:0,}"
Jan 29 16:13:36.052304 containerd[1493]: time="2025-01-29T16:13:36.052251322Z" level=warning msg="cleanup warnings time=\"2025-01-29T16:13:36Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
Jan 29 16:13:36.060039 containerd[1493]: time="2025-01-29T16:13:36.059994291Z" level=warning msg="cleanup warnings time=\"2025-01-29T16:13:36Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
Jan 29 16:13:36.062733 containerd[1493]: time="2025-01-29T16:13:36.062697888Z" level=info msg="StopContainer for \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\" returns successfully"
Jan 29 16:13:36.069560 containerd[1493]: time="2025-01-29T16:13:36.069506506Z" level=info msg="StopContainer for \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\" returns successfully"
Jan 29 16:13:36.070438 containerd[1493]: time="2025-01-29T16:13:36.070374905Z" level=info msg="StopPodSandbox for \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\""
Jan 29 16:13:36.070698 containerd[1493]: time="2025-01-29T16:13:36.070626315Z" level=info msg="Container to stop \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Jan 29 16:13:36.071421 containerd[1493]: time="2025-01-29T16:13:36.071383162Z" level=info msg="StopPodSandbox for \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\""
Jan 29 16:13:36.071625 containerd[1493]: time="2025-01-29T16:13:36.071594308Z" level=info msg="Container to stop \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Jan 29 16:13:36.096389 containerd[1493]: time="2025-01-29T16:13:36.095919852Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 29 16:13:36.096389 containerd[1493]: time="2025-01-29T16:13:36.096010861Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 29 16:13:36.096389 containerd[1493]: time="2025-01-29T16:13:36.096027155Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 29 16:13:36.096389 containerd[1493]: time="2025-01-29T16:13:36.096132646Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 29 16:13:36.098687 systemd[1]: cri-containerd-9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794.scope: Deactivated successfully.
Jan 29 16:13:36.103960 systemd[1]: cri-containerd-d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca.scope: Deactivated successfully.
Jan 29 16:13:36.173426 systemd[1]: Started cri-containerd-44184ff6951f8f1c17ef8a66b0d5584ae0924d714770387d85469c7ca705d923.scope - libcontainer container 44184ff6951f8f1c17ef8a66b0d5584ae0924d714770387d85469c7ca705d923.
Jan 29 16:13:36.195854 kubelet[2663]: I0129 16:13:36.195602 2663 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac778ba4-a7a1-4547-bfec-fdc4b3b030c3" path="/var/lib/kubelet/pods/ac778ba4-a7a1-4547-bfec-fdc4b3b030c3/volumes" Jan 29 16:13:36.225286 containerd[1493]: time="2025-01-29T16:13:36.224243219Z" level=info msg="shim disconnected" id=9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794 namespace=k8s.io Jan 29 16:13:36.225286 containerd[1493]: time="2025-01-29T16:13:36.224670301Z" level=warning msg="cleaning up after shim disconnected" id=9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794 namespace=k8s.io Jan 29 16:13:36.225286 containerd[1493]: time="2025-01-29T16:13:36.224692870Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 29 16:13:36.231419 containerd[1493]: time="2025-01-29T16:13:36.231215729Z" level=info msg="shim disconnected" id=d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca namespace=k8s.io Jan 29 16:13:36.231419 containerd[1493]: time="2025-01-29T16:13:36.231285337Z" level=warning msg="cleaning up after shim disconnected" id=d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca namespace=k8s.io Jan 29 16:13:36.231419 containerd[1493]: time="2025-01-29T16:13:36.231302108Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 29 16:13:36.279857 containerd[1493]: time="2025-01-29T16:13:36.278307180Z" level=warning msg="cleanup warnings time=\"2025-01-29T16:13:36Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Jan 29 16:13:36.281673 containerd[1493]: time="2025-01-29T16:13:36.281595087Z" level=info msg="TearDown network for sandbox \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\" successfully" Jan 29 16:13:36.281764 containerd[1493]: time="2025-01-29T16:13:36.281637180Z" level=info msg="StopPodSandbox for \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\" returns successfully" Jan 29 16:13:36.336650 containerd[1493]: time="2025-01-29T16:13:36.336581729Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kv5nf,Uid:1097b7ed-9ec5-4d1d-9861-e504a1b3e4da,Namespace:calico-system,Attempt:0,} returns sandbox id \"44184ff6951f8f1c17ef8a66b0d5584ae0924d714770387d85469c7ca705d923\"" Jan 29 16:13:36.345674 containerd[1493]: time="2025-01-29T16:13:36.344623121Z" level=info msg="CreateContainer within sandbox \"44184ff6951f8f1c17ef8a66b0d5584ae0924d714770387d85469c7ca705d923\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 29 16:13:36.385202 containerd[1493]: time="2025-01-29T16:13:36.383497300Z" level=info msg="CreateContainer within sandbox \"44184ff6951f8f1c17ef8a66b0d5584ae0924d714770387d85469c7ca705d923\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a943273a4caa015467eee7c5bf58e2c2978dc83668d642fabe0acd7e015c786c\"" Jan 29 16:13:36.385202 containerd[1493]: time="2025-01-29T16:13:36.385019515Z" level=info msg="StartContainer for \"a943273a4caa015467eee7c5bf58e2c2978dc83668d642fabe0acd7e015c786c\"" Jan 29 16:13:36.452661 kubelet[2663]: I0129 16:13:36.452476 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr2c2\" (UniqueName: \"kubernetes.io/projected/67d150fe-b678-4784-b548-afba79174d20-kube-api-access-pr2c2\") pod \"67d150fe-b678-4784-b548-afba79174d20\" (UID: \"67d150fe-b678-4784-b548-afba79174d20\") " 
Jan 29 16:13:36.452661 kubelet[2663]: I0129 16:13:36.452539 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d150fe-b678-4784-b548-afba79174d20-tigera-ca-bundle\") pod \"67d150fe-b678-4784-b548-afba79174d20\" (UID: \"67d150fe-b678-4784-b548-afba79174d20\") " Jan 29 16:13:36.452661 kubelet[2663]: I0129 16:13:36.452571 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/67d150fe-b678-4784-b548-afba79174d20-typha-certs\") pod \"67d150fe-b678-4784-b548-afba79174d20\" (UID: \"67d150fe-b678-4784-b548-afba79174d20\") " Jan 29 16:13:36.486299 kubelet[2663]: I0129 16:13:36.475878 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67d150fe-b678-4784-b548-afba79174d20-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "67d150fe-b678-4784-b548-afba79174d20" (UID: "67d150fe-b678-4784-b548-afba79174d20"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 29 16:13:36.486299 kubelet[2663]: I0129 16:13:36.478707 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67d150fe-b678-4784-b548-afba79174d20-kube-api-access-pr2c2" (OuterVolumeSpecName: "kube-api-access-pr2c2") pod "67d150fe-b678-4784-b548-afba79174d20" (UID: "67d150fe-b678-4784-b548-afba79174d20"). InnerVolumeSpecName "kube-api-access-pr2c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:13:36.491136 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c-rootfs.mount: Deactivated successfully. Jan 29 16:13:36.491510 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca-rootfs.mount: Deactivated successfully. Jan 29 16:13:36.491755 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca-shm.mount: Deactivated successfully. Jan 29 16:13:36.491966 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400-rootfs.mount: Deactivated successfully. Jan 29 16:13:36.492071 systemd[1]: var-lib-kubelet-pods-67d150fe\x2db678\x2d4784\x2db548\x2dafba79174d20-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. Jan 29 16:13:36.492419 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794-rootfs.mount: Deactivated successfully. Jan 29 16:13:36.492660 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794-shm.mount: Deactivated successfully. Jan 29 16:13:36.492828 systemd[1]: var-lib-kubelet-pods-67d150fe\x2db678\x2d4784\x2db548\x2dafba79174d20-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dpr2c2.mount: Deactivated successfully. Jan 29 16:13:36.493066 systemd[1]: var-lib-kubelet-pods-67d150fe\x2db678\x2d4784\x2db548\x2dafba79174d20-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. Jan 29 16:13:36.505397 systemd[1]: Started cri-containerd-a943273a4caa015467eee7c5bf58e2c2978dc83668d642fabe0acd7e015c786c.scope - libcontainer container a943273a4caa015467eee7c5bf58e2c2978dc83668d642fabe0acd7e015c786c. 
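[Editor's note] The mount-unit names in the cleanup entries above (var-lib-kubelet-pods-67d150fe\x2db678\x2d…, …kubernetes.io\x7esecret-typha\x2dcerts.mount) are systemd's path escaping at work: "/" separators become "-", while "-" and "~" inside a path component are hex-escaped so they survive the mapping. A deliberately simplified sketch of that escaping — the real rules in systemd.unit(5)/systemd-escape(1) escape every byte outside [a-zA-Z0-9:_.], plus leading dots; this only covers the characters visible in this log:

    package main

    import (
    	"fmt"
    	"strings"
    )

    // escapePath approximates systemd's path-to-unit-name escaping as seen in
    // the mount units above: "-" -> `\x2d`, "~" -> `\x7e`, then "/" -> "-".
    // Simplified sketch only; not a full systemd-escape implementation.
    func escapePath(p string) string {
    	p = strings.Trim(p, "/")
    	p = strings.ReplaceAll(p, "-", `\x2d`)
    	p = strings.ReplaceAll(p, "~", `\x7e`)
    	return strings.ReplaceAll(p, "/", "-")
    }

    func main() {
    	pod := "/var/lib/kubelet/pods/67d150fe-b678-4784-b548-afba79174d20/volumes/kubernetes.io~secret/typha-certs"
    	fmt.Println(escapePath(pod) + ".mount")
    	// var-lib-kubelet-pods-67d150fe\x2db678\x2d4784\x2db548\x2dafba79174d20-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount
    }

The output matches the unit name systemd deactivates above, which is why those long escaped names appear whenever the kubelet tears down a pod's volume mounts.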
Jan 29 16:13:36.508375 kubelet[2663]: I0129 16:13:36.505932 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67d150fe-b678-4784-b548-afba79174d20-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "67d150fe-b678-4784-b548-afba79174d20" (UID: "67d150fe-b678-4784-b548-afba79174d20"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:13:36.555021 kubelet[2663]: I0129 16:13:36.554896 2663 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-pr2c2\" (UniqueName: \"kubernetes.io/projected/67d150fe-b678-4784-b548-afba79174d20-kube-api-access-pr2c2\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\"" Jan 29 16:13:36.555021 kubelet[2663]: I0129 16:13:36.554949 2663 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67d150fe-b678-4784-b548-afba79174d20-tigera-ca-bundle\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\"" Jan 29 16:13:36.555021 kubelet[2663]: I0129 16:13:36.554965 2663 reconciler_common.go:288] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/67d150fe-b678-4784-b548-afba79174d20-typha-certs\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\"" Jan 29 16:13:36.620430 systemd-networkd[1427]: calie19d29f925e: Link DOWN Jan 29 16:13:36.624276 systemd-networkd[1427]: calie19d29f925e: Lost carrier Jan 29 16:13:36.761546 containerd[1493]: time="2025-01-29T16:13:36.761442211Z" level=info msg="StartContainer for \"a943273a4caa015467eee7c5bf58e2c2978dc83668d642fabe0acd7e015c786c\" returns successfully" Jan 29 16:13:36.803800 kubelet[2663]: I0129 16:13:36.803761 2663 scope.go:117] "RemoveContainer" containerID="ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400" Jan 29 16:13:36.826800 containerd[1493]: time="2025-01-29T16:13:36.826139774Z" level=info msg="RemoveContainer for \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\"" Jan 29 16:13:36.841398 kubelet[2663]: I0129 16:13:36.841357 2663 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:36.844097 containerd[1493]: time="2025-01-29T16:13:36.844056152Z" level=info msg="RemoveContainer for \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\" returns successfully" Jan 29 16:13:36.844738 kubelet[2663]: I0129 16:13:36.844315 2663 scope.go:117] "RemoveContainer" containerID="ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400" Jan 29 16:13:36.847643 systemd[1]: Removed slice kubepods-besteffort-pod67d150fe_b678_4784_b548_afba79174d20.slice - libcontainer container kubepods-besteffort-pod67d150fe_b678_4784_b548_afba79174d20.slice. 
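[Editor's note] The RemoveContainer / ContainerStatus exchanges in this stretch of the log (the 33aa… and 50cf… probes near the top of this excerpt, and the ffbabbbe… one just below) all end in gRPC NotFound: the kubelet asks the runtime about containers that were already deleted, logs the error, and moves on, because container deletion is idempotent. A minimal sketch of that check, assuming only the standard grpc-go status package:

    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // alreadyGone reports whether a CRI error carries gRPC code NotFound,
    // i.e. the container the kubelet asked about no longer exists — which a
    // deletion loop can safely treat as "nothing left to do".
    func alreadyGone(err error) bool {
    	return status.Code(err) == codes.NotFound
    }

    func main() {
    	// Simulated runtime response, shaped like the errors in the log.
    	err := status.Error(codes.NotFound,
    		"an error occurred when try to find container: not found")
    	fmt.Println(alreadyGone(err)) // true
    }

(The ungrammatical "when try to find container" is containerd's verbatim error string, preserved here as in the log.)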
Jan 29 16:13:36.848636 kubelet[2663]: E0129 16:13:36.848161 2663 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\": not found" containerID="ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400" Jan 29 16:13:36.848636 kubelet[2663]: I0129 16:13:36.848568 2663 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400"} err="failed to get container status \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\": rpc error: code = NotFound desc = an error occurred when try to find container \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\": not found" Jan 29 16:13:36.848764 containerd[1493]: time="2025-01-29T16:13:36.847754045Z" level=error msg="ContainerStatus for \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"ffbabbbe4b42d539133ed726c98aef3e05397e2ea8ebef8d3974be3882abc400\": not found" Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.616 [INFO][5198] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.617 [INFO][5198] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" iface="eth0" netns="/var/run/netns/cni-691f1b5c-975d-8e6e-7ad8-012a3e81af95" Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.618 [INFO][5198] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" iface="eth0" netns="/var/run/netns/cni-691f1b5c-975d-8e6e-7ad8-012a3e81af95" Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.629 [INFO][5198] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" after=11.419716ms iface="eth0" netns="/var/run/netns/cni-691f1b5c-975d-8e6e-7ad8-012a3e81af95" Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.629 [INFO][5198] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.629 [INFO][5198] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.754 [INFO][5234] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.756 [INFO][5234] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.756 [INFO][5234] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.890 [INFO][5234] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.892 [INFO][5234] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.896 [INFO][5234] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:36.925941 containerd[1493]: 2025-01-29 16:13:36.907 [INFO][5198] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:36.929095 containerd[1493]: time="2025-01-29T16:13:36.928323497Z" level=info msg="TearDown network for sandbox \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\" successfully" Jan 29 16:13:36.929095 containerd[1493]: time="2025-01-29T16:13:36.928361454Z" level=info msg="StopPodSandbox for \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\" returns successfully" Jan 29 16:13:36.927203 systemd[1]: run-netns-cni\x2d691f1b5c\x2d975d\x2d8e6e\x2d7ad8\x2d012a3e81af95.mount: Deactivated successfully. Jan 29 16:13:36.934406 containerd[1493]: time="2025-01-29T16:13:36.932992399Z" level=info msg="StopPodSandbox for \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\"" Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.080 [WARNING][5266] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0", GenerateName:"calico-kube-controllers-85cfdd4458-", Namespace:"calico-system", SelfLink:"", UID:"d480db07-6802-4f6a-b925-8f4c938e9f7b", ResourceVersion:"982", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"85cfdd4458", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca", Pod:"calico-kube-controllers-85cfdd4458-rcthl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie19d29f925e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.081 [INFO][5266] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.081 [INFO][5266] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" iface="eth0" netns="" Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.081 [INFO][5266] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.082 [INFO][5266] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.141 [INFO][5272] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.141 [INFO][5272] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.141 [INFO][5272] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.157 [WARNING][5272] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.157 [INFO][5272] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.160 [INFO][5272] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:37.164938 containerd[1493]: 2025-01-29 16:13:37.162 [INFO][5266] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:37.170970 containerd[1493]: time="2025-01-29T16:13:37.168805148Z" level=info msg="TearDown network for sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\" successfully" Jan 29 16:13:37.170970 containerd[1493]: time="2025-01-29T16:13:37.168858324Z" level=info msg="StopPodSandbox for \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\" returns successfully" Jan 29 16:13:37.169421 systemd[1]: cri-containerd-a943273a4caa015467eee7c5bf58e2c2978dc83668d642fabe0acd7e015c786c.scope: Deactivated successfully. Jan 29 16:13:37.259037 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a943273a4caa015467eee7c5bf58e2c2978dc83668d642fabe0acd7e015c786c-rootfs.mount: Deactivated successfully. Jan 29 16:13:37.275984 kubelet[2663]: I0129 16:13:37.275333 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d480db07-6802-4f6a-b925-8f4c938e9f7b-tigera-ca-bundle\") pod \"d480db07-6802-4f6a-b925-8f4c938e9f7b\" (UID: \"d480db07-6802-4f6a-b925-8f4c938e9f7b\") " Jan 29 16:13:37.297409 kubelet[2663]: I0129 16:13:37.276311 2663 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9pmg\" (UniqueName: \"kubernetes.io/projected/d480db07-6802-4f6a-b925-8f4c938e9f7b-kube-api-access-j9pmg\") pod \"d480db07-6802-4f6a-b925-8f4c938e9f7b\" (UID: \"d480db07-6802-4f6a-b925-8f4c938e9f7b\") " Jan 29 16:13:37.288859 systemd[1]: var-lib-kubelet-pods-d480db07\x2d6802\x2d4f6a\x2db925\x2d8f4c938e9f7b-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully. Jan 29 16:13:37.301608 kubelet[2663]: I0129 16:13:37.301458 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d480db07-6802-4f6a-b925-8f4c938e9f7b-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "d480db07-6802-4f6a-b925-8f4c938e9f7b" (UID: "d480db07-6802-4f6a-b925-8f4c938e9f7b"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 29 16:13:37.311472 systemd[1]: var-lib-kubelet-pods-d480db07\x2d6802\x2d4f6a\x2db925\x2d8f4c938e9f7b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dj9pmg.mount: Deactivated successfully. 
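[Editor's note] The "Asked to release address but it doesn't exist. Ignoring" warning just above shows the DEL side of CNI being idempotent: the CNI spec requires DEL to succeed even when the allocation is already gone, so an unknown IPAM handle is logged and skipped rather than returned as an error. A toy model of that behavior — the in-memory map is hypothetical; Calico's real state lives in its datastore, guarded by the host-wide IPAM lock seen in these traces:

    package main

    import "fmt"

    // releaseByHandle drops all addresses recorded under a handle. A missing
    // handle is not an error: a repeated DEL for the same sandbox must still
    // succeed, so it is logged and ignored, mirroring the warning above.
    func releaseByHandle(allocs map[string][]string, handleID string) []string {
    	ips, ok := allocs[handleID]
    	if !ok {
    		fmt.Println("Asked to release address but it doesn't exist. Ignoring")
    		return nil
    	}
    	delete(allocs, handleID)
    	return ips
    }

    func main() {
    	allocs := map[string][]string{}
    	// Second DEL for a sandbox whose address was already released.
    	releaseByHandle(allocs, "k8s-pod-network.1b4c292674d5")
    }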
Jan 29 16:13:37.314891 kubelet[2663]: I0129 16:13:37.314057 2663 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d480db07-6802-4f6a-b925-8f4c938e9f7b-kube-api-access-j9pmg" (OuterVolumeSpecName: "kube-api-access-j9pmg") pod "d480db07-6802-4f6a-b925-8f4c938e9f7b" (UID: "d480db07-6802-4f6a-b925-8f4c938e9f7b"). InnerVolumeSpecName "kube-api-access-j9pmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 29 16:13:37.315573 containerd[1493]: time="2025-01-29T16:13:37.315482929Z" level=info msg="shim disconnected" id=a943273a4caa015467eee7c5bf58e2c2978dc83668d642fabe0acd7e015c786c namespace=k8s.io Jan 29 16:13:37.315843 containerd[1493]: time="2025-01-29T16:13:37.315764315Z" level=warning msg="cleaning up after shim disconnected" id=a943273a4caa015467eee7c5bf58e2c2978dc83668d642fabe0acd7e015c786c namespace=k8s.io Jan 29 16:13:37.315843 containerd[1493]: time="2025-01-29T16:13:37.315790680Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 29 16:13:37.350207 containerd[1493]: time="2025-01-29T16:13:37.350062552Z" level=warning msg="cleanup warnings time=\"2025-01-29T16:13:37Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Jan 29 16:13:37.355880 containerd[1493]: time="2025-01-29T16:13:37.354714700Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:37.355880 containerd[1493]: time="2025-01-29T16:13:37.355843313Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Jan 29 16:13:37.356557 containerd[1493]: time="2025-01-29T16:13:37.356495207Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:37.361810 containerd[1493]: time="2025-01-29T16:13:37.361778638Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:37.364237 containerd[1493]: time="2025-01-29T16:13:37.363541862Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 2.513680764s" Jan 29 16:13:37.364237 containerd[1493]: time="2025-01-29T16:13:37.363583885Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Jan 29 16:13:37.370774 containerd[1493]: time="2025-01-29T16:13:37.370738198Z" level=info msg="CreateContainer within sandbox \"b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 29 16:13:37.376689 kubelet[2663]: I0129 16:13:37.376594 2663 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d480db07-6802-4f6a-b925-8f4c938e9f7b-tigera-ca-bundle\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\"" Jan 29 16:13:37.376689 kubelet[2663]: I0129 
16:13:37.376631 2663 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-j9pmg\" (UniqueName: \"kubernetes.io/projected/d480db07-6802-4f6a-b925-8f4c938e9f7b-kube-api-access-j9pmg\") on node \"srv-6bdnt.gb1.brightbox.com\" DevicePath \"\"" Jan 29 16:13:37.387627 containerd[1493]: time="2025-01-29T16:13:37.387442625Z" level=info msg="CreateContainer within sandbox \"b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"8e9e7f77f946d4919b22d944d806b3042f042d4ea53acf9e1f3666a009fc2e41\"" Jan 29 16:13:37.388547 containerd[1493]: time="2025-01-29T16:13:37.388474498Z" level=info msg="StartContainer for \"8e9e7f77f946d4919b22d944d806b3042f042d4ea53acf9e1f3666a009fc2e41\"" Jan 29 16:13:37.437362 systemd[1]: Started cri-containerd-8e9e7f77f946d4919b22d944d806b3042f042d4ea53acf9e1f3666a009fc2e41.scope - libcontainer container 8e9e7f77f946d4919b22d944d806b3042f042d4ea53acf9e1f3666a009fc2e41. Jan 29 16:13:37.566748 containerd[1493]: time="2025-01-29T16:13:37.566571960Z" level=info msg="StartContainer for \"8e9e7f77f946d4919b22d944d806b3042f042d4ea53acf9e1f3666a009fc2e41\" returns successfully" Jan 29 16:13:37.570273 containerd[1493]: time="2025-01-29T16:13:37.570201799Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 29 16:13:37.889145 containerd[1493]: time="2025-01-29T16:13:37.889067370Z" level=info msg="CreateContainer within sandbox \"44184ff6951f8f1c17ef8a66b0d5584ae0924d714770387d85469c7ca705d923\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 29 16:13:37.922903 systemd[1]: Removed slice kubepods-besteffort-podd480db07_6802_4f6a_b925_8f4c938e9f7b.slice - libcontainer container kubepods-besteffort-podd480db07_6802_4f6a_b925_8f4c938e9f7b.slice. Jan 29 16:13:37.926434 containerd[1493]: time="2025-01-29T16:13:37.925744412Z" level=info msg="CreateContainer within sandbox \"44184ff6951f8f1c17ef8a66b0d5584ae0924d714770387d85469c7ca705d923\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"ac9d0bca7f6cc9e007b0ac8a04b4ebd4028702993e12516ecde15466085b7106\"" Jan 29 16:13:37.940528 containerd[1493]: time="2025-01-29T16:13:37.940469900Z" level=info msg="StartContainer for \"ac9d0bca7f6cc9e007b0ac8a04b4ebd4028702993e12516ecde15466085b7106\"" Jan 29 16:13:38.046419 systemd[1]: Started cri-containerd-ac9d0bca7f6cc9e007b0ac8a04b4ebd4028702993e12516ecde15466085b7106.scope - libcontainer container ac9d0bca7f6cc9e007b0ac8a04b4ebd4028702993e12516ecde15466085b7106. 
Jan 29 16:13:38.175510 kubelet[2663]: I0129 16:13:38.174453 2663 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67d150fe-b678-4784-b548-afba79174d20" path="/var/lib/kubelet/pods/67d150fe-b678-4784-b548-afba79174d20/volumes" Jan 29 16:13:38.177885 kubelet[2663]: I0129 16:13:38.177413 2663 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d480db07-6802-4f6a-b925-8f4c938e9f7b" path="/var/lib/kubelet/pods/d480db07-6802-4f6a-b925-8f4c938e9f7b/volumes" Jan 29 16:13:38.210410 containerd[1493]: time="2025-01-29T16:13:38.210321669Z" level=info msg="StartContainer for \"ac9d0bca7f6cc9e007b0ac8a04b4ebd4028702993e12516ecde15466085b7106\" returns successfully" Jan 29 16:13:38.439860 kubelet[2663]: E0129 16:13:38.439468 2663 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="67d150fe-b678-4784-b548-afba79174d20" containerName="calico-typha" Jan 29 16:13:38.439860 kubelet[2663]: E0129 16:13:38.439539 2663 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="d480db07-6802-4f6a-b925-8f4c938e9f7b" containerName="calico-kube-controllers" Jan 29 16:13:38.439860 kubelet[2663]: I0129 16:13:38.439609 2663 memory_manager.go:354] "RemoveStaleState removing state" podUID="67d150fe-b678-4784-b548-afba79174d20" containerName="calico-typha" Jan 29 16:13:38.439860 kubelet[2663]: I0129 16:13:38.439632 2663 memory_manager.go:354] "RemoveStaleState removing state" podUID="d480db07-6802-4f6a-b925-8f4c938e9f7b" containerName="calico-kube-controllers" Jan 29 16:13:38.464641 systemd[1]: Created slice kubepods-besteffort-pod1c9cf4d7_dc71_413f_a572_17aa050dc4eb.slice - libcontainer container kubepods-besteffort-pod1c9cf4d7_dc71_413f_a572_17aa050dc4eb.slice. Jan 29 16:13:38.481629 systemd[1]: run-containerd-runc-k8s.io-ac9d0bca7f6cc9e007b0ac8a04b4ebd4028702993e12516ecde15466085b7106-runc.8nYnSx.mount: Deactivated successfully. Jan 29 16:13:38.584960 kubelet[2663]: I0129 16:13:38.584903 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztm68\" (UniqueName: \"kubernetes.io/projected/1c9cf4d7-dc71-413f-a572-17aa050dc4eb-kube-api-access-ztm68\") pod \"calico-typha-65785558c6-swg7j\" (UID: \"1c9cf4d7-dc71-413f-a572-17aa050dc4eb\") " pod="calico-system/calico-typha-65785558c6-swg7j" Jan 29 16:13:38.585130 kubelet[2663]: I0129 16:13:38.584970 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/1c9cf4d7-dc71-413f-a572-17aa050dc4eb-typha-certs\") pod \"calico-typha-65785558c6-swg7j\" (UID: \"1c9cf4d7-dc71-413f-a572-17aa050dc4eb\") " pod="calico-system/calico-typha-65785558c6-swg7j" Jan 29 16:13:38.585130 kubelet[2663]: I0129 16:13:38.585012 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c9cf4d7-dc71-413f-a572-17aa050dc4eb-tigera-ca-bundle\") pod \"calico-typha-65785558c6-swg7j\" (UID: \"1c9cf4d7-dc71-413f-a572-17aa050dc4eb\") " pod="calico-system/calico-typha-65785558c6-swg7j" Jan 29 16:13:38.771672 containerd[1493]: time="2025-01-29T16:13:38.771077836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65785558c6-swg7j,Uid:1c9cf4d7-dc71-413f-a572-17aa050dc4eb,Namespace:calico-system,Attempt:0,}" Jan 29 16:13:38.821893 containerd[1493]: time="2025-01-29T16:13:38.821346119Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:13:38.823624 containerd[1493]: time="2025-01-29T16:13:38.821868119Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:13:38.823624 containerd[1493]: time="2025-01-29T16:13:38.822973604Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:38.823624 containerd[1493]: time="2025-01-29T16:13:38.823100308Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:38.863416 systemd[1]: Started cri-containerd-523fbb2b13502c7ab2d289e3e95e2b040c7205aced943784bbf2c8300efc6b57.scope - libcontainer container 523fbb2b13502c7ab2d289e3e95e2b040c7205aced943784bbf2c8300efc6b57. Jan 29 16:13:39.086870 containerd[1493]: time="2025-01-29T16:13:39.085650778Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65785558c6-swg7j,Uid:1c9cf4d7-dc71-413f-a572-17aa050dc4eb,Namespace:calico-system,Attempt:0,} returns sandbox id \"523fbb2b13502c7ab2d289e3e95e2b040c7205aced943784bbf2c8300efc6b57\"" Jan 29 16:13:39.113217 containerd[1493]: time="2025-01-29T16:13:39.112652386Z" level=info msg="CreateContainer within sandbox \"523fbb2b13502c7ab2d289e3e95e2b040c7205aced943784bbf2c8300efc6b57\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 29 16:13:39.141618 containerd[1493]: time="2025-01-29T16:13:39.141164449Z" level=info msg="CreateContainer within sandbox \"523fbb2b13502c7ab2d289e3e95e2b040c7205aced943784bbf2c8300efc6b57\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"896422bbd3d398f64954f5a38fd23fc56aa3267e9a55d5305e622e97b152edf9\"" Jan 29 16:13:39.142470 containerd[1493]: time="2025-01-29T16:13:39.142432450Z" level=info msg="StartContainer for \"896422bbd3d398f64954f5a38fd23fc56aa3267e9a55d5305e622e97b152edf9\"" Jan 29 16:13:39.212214 systemd[1]: Started cri-containerd-896422bbd3d398f64954f5a38fd23fc56aa3267e9a55d5305e622e97b152edf9.scope - libcontainer container 896422bbd3d398f64954f5a38fd23fc56aa3267e9a55d5305e622e97b152edf9. 
Jan 29 16:13:39.459063 containerd[1493]: time="2025-01-29T16:13:39.458893048Z" level=info msg="StartContainer for \"896422bbd3d398f64954f5a38fd23fc56aa3267e9a55d5305e622e97b152edf9\" returns successfully" Jan 29 16:13:40.957657 kubelet[2663]: I0129 16:13:40.957559 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-65785558c6-swg7j" podStartSLOduration=6.943889759 podStartE2EDuration="6.943889759s" podCreationTimestamp="2025-01-29 16:13:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 16:13:39.943879596 +0000 UTC m=+55.963197401" watchObservedRunningTime="2025-01-29 16:13:40.943889759 +0000 UTC m=+56.963207564" Jan 29 16:13:41.003565 containerd[1493]: time="2025-01-29T16:13:41.003376244Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:41.004619 containerd[1493]: time="2025-01-29T16:13:41.004474716Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Jan 29 16:13:41.008655 containerd[1493]: time="2025-01-29T16:13:41.007052598Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:41.012580 containerd[1493]: time="2025-01-29T16:13:41.012529687Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 29 16:13:41.014137 containerd[1493]: time="2025-01-29T16:13:41.014081504Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 3.443828661s" Jan 29 16:13:41.014333 containerd[1493]: time="2025-01-29T16:13:41.014294683Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Jan 29 16:13:41.019856 containerd[1493]: time="2025-01-29T16:13:41.019820932Z" level=info msg="CreateContainer within sandbox \"b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 29 16:13:41.076527 containerd[1493]: time="2025-01-29T16:13:41.076448379Z" level=info msg="CreateContainer within sandbox \"b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"654a3e9d0fd0b1a1be29ac8bd14086e301ab94f213190e6d0fda2493aee4a828\"" Jan 29 16:13:41.077849 containerd[1493]: time="2025-01-29T16:13:41.077821717Z" level=info msg="StartContainer for \"654a3e9d0fd0b1a1be29ac8bd14086e301ab94f213190e6d0fda2493aee4a828\"" Jan 29 16:13:41.206398 systemd[1]: Started cri-containerd-654a3e9d0fd0b1a1be29ac8bd14086e301ab94f213190e6d0fda2493aee4a828.scope - libcontainer container 654a3e9d0fd0b1a1be29ac8bd14086e301ab94f213190e6d0fda2493aee4a828. 
Jan 29 16:13:41.323875 containerd[1493]: time="2025-01-29T16:13:41.322618739Z" level=info msg="StartContainer for \"654a3e9d0fd0b1a1be29ac8bd14086e301ab94f213190e6d0fda2493aee4a828\" returns successfully" Jan 29 16:13:41.955598 kubelet[2663]: I0129 16:13:41.955502 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-gd895" podStartSLOduration=34.514272852 podStartE2EDuration="44.955468514s" podCreationTimestamp="2025-01-29 16:12:57 +0000 UTC" firstStartedPulling="2025-01-29 16:13:30.575784161 +0000 UTC m=+46.595101957" lastFinishedPulling="2025-01-29 16:13:41.016979824 +0000 UTC m=+57.036297619" observedRunningTime="2025-01-29 16:13:41.954101747 +0000 UTC m=+57.973419668" watchObservedRunningTime="2025-01-29 16:13:41.955468514 +0000 UTC m=+57.974786320" Jan 29 16:13:42.134492 systemd[1]: cri-containerd-ac9d0bca7f6cc9e007b0ac8a04b4ebd4028702993e12516ecde15466085b7106.scope: Deactivated successfully. Jan 29 16:13:42.135391 systemd[1]: cri-containerd-ac9d0bca7f6cc9e007b0ac8a04b4ebd4028702993e12516ecde15466085b7106.scope: Consumed 1.302s CPU time. Jan 29 16:13:42.190676 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ac9d0bca7f6cc9e007b0ac8a04b4ebd4028702993e12516ecde15466085b7106-rootfs.mount: Deactivated successfully. Jan 29 16:13:42.221721 containerd[1493]: time="2025-01-29T16:13:42.221428513Z" level=info msg="shim disconnected" id=ac9d0bca7f6cc9e007b0ac8a04b4ebd4028702993e12516ecde15466085b7106 namespace=k8s.io Jan 29 16:13:42.221721 containerd[1493]: time="2025-01-29T16:13:42.221677954Z" level=warning msg="cleaning up after shim disconnected" id=ac9d0bca7f6cc9e007b0ac8a04b4ebd4028702993e12516ecde15466085b7106 namespace=k8s.io Jan 29 16:13:42.221721 containerd[1493]: time="2025-01-29T16:13:42.221694665Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 29 16:13:42.578885 kubelet[2663]: I0129 16:13:42.578628 2663 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 29 16:13:42.586956 kubelet[2663]: I0129 16:13:42.586822 2663 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 29 16:13:42.982269 containerd[1493]: time="2025-01-29T16:13:42.978493172Z" level=info msg="CreateContainer within sandbox \"44184ff6951f8f1c17ef8a66b0d5584ae0924d714770387d85469c7ca705d923\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 29 16:13:43.052430 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2865359954.mount: Deactivated successfully. Jan 29 16:13:43.072676 containerd[1493]: time="2025-01-29T16:13:43.072591450Z" level=info msg="CreateContainer within sandbox \"44184ff6951f8f1c17ef8a66b0d5584ae0924d714770387d85469c7ca705d923\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"59dd0e3a035a458b9576572ff07cb1c4282b6030cd73f7c4b2b8573be02cd043\"" Jan 29 16:13:43.074220 containerd[1493]: time="2025-01-29T16:13:43.073299401Z" level=info msg="StartContainer for \"59dd0e3a035a458b9576572ff07cb1c4282b6030cd73f7c4b2b8573be02cd043\"" Jan 29 16:13:43.146432 systemd[1]: Started cri-containerd-59dd0e3a035a458b9576572ff07cb1c4282b6030cd73f7c4b2b8573be02cd043.scope - libcontainer container 59dd0e3a035a458b9576572ff07cb1c4282b6030cd73f7c4b2b8573be02cd043. 
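[Editor's note] The pod_startup_latency_tracker entry above for csi-node-driver-gd895 reports two durations: podStartE2EDuration (observed running time minus pod creation) and the shorter podStartSLOduration, which excludes the image-pull window between firstStartedPulling and lastFinishedPulling. The arithmetic can be checked directly from the timestamps in that entry (Go's integer-nanosecond result differs from the kubelet's logged value only in the last digit, a float-rounding artifact):

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// Timestamps copied from the pod_startup_latency_tracker entry above.
    	parse := func(s string) time.Time {
    		t, err := time.Parse(time.RFC3339Nano, s)
    		if err != nil {
    			panic(err)
    		}
    		return t
    	}
    	created := parse("2025-01-29T16:12:57Z")
    	running := parse("2025-01-29T16:13:41.955468514Z")
    	pullStart := parse("2025-01-29T16:13:30.575784161Z")
    	pullEnd := parse("2025-01-29T16:13:41.016979824Z")

    	e2e := running.Sub(created)         // podStartE2EDuration
    	slo := e2e - pullEnd.Sub(pullStart) // podStartSLOduration: E2E minus pull time
    	fmt.Println(e2e, slo)               // 44.955468514s 34.514272851s
    }

For pods like calico-typha above, whose containers never pulled (both pull timestamps are the zero value 0001-01-01), the two durations are identical.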
Jan 29 16:13:43.250460 containerd[1493]: time="2025-01-29T16:13:43.250143451Z" level=info msg="StartContainer for \"59dd0e3a035a458b9576572ff07cb1c4282b6030cd73f7c4b2b8573be02cd043\" returns successfully" Jan 29 16:13:43.557938 systemd[1]: Created slice kubepods-besteffort-pod7cdfa0f5_75f3_4355_882a_cc675d837452.slice - libcontainer container kubepods-besteffort-pod7cdfa0f5_75f3_4355_882a_cc675d837452.slice. Jan 29 16:13:43.635997 kubelet[2663]: I0129 16:13:43.635946 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cdfa0f5-75f3-4355-882a-cc675d837452-tigera-ca-bundle\") pod \"calico-kube-controllers-6b89bbdd5d-tnztj\" (UID: \"7cdfa0f5-75f3-4355-882a-cc675d837452\") " pod="calico-system/calico-kube-controllers-6b89bbdd5d-tnztj" Jan 29 16:13:43.636874 kubelet[2663]: I0129 16:13:43.636013 2663 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j62hd\" (UniqueName: \"kubernetes.io/projected/7cdfa0f5-75f3-4355-882a-cc675d837452-kube-api-access-j62hd\") pod \"calico-kube-controllers-6b89bbdd5d-tnztj\" (UID: \"7cdfa0f5-75f3-4355-882a-cc675d837452\") " pod="calico-system/calico-kube-controllers-6b89bbdd5d-tnztj" Jan 29 16:13:43.876769 containerd[1493]: time="2025-01-29T16:13:43.876588161Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b89bbdd5d-tnztj,Uid:7cdfa0f5-75f3-4355-882a-cc675d837452,Namespace:calico-system,Attempt:0,}" Jan 29 16:13:43.985799 kubelet[2663]: I0129 16:13:43.985507 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-kv5nf" podStartSLOduration=8.985486417 podStartE2EDuration="8.985486417s" podCreationTimestamp="2025-01-29 16:13:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 16:13:43.984957486 +0000 UTC m=+60.004275298" watchObservedRunningTime="2025-01-29 16:13:43.985486417 +0000 UTC m=+60.004804216" Jan 29 16:13:44.220347 systemd-networkd[1427]: cali1bb2cef749c: Link UP Jan 29 16:13:44.223025 systemd-networkd[1427]: cali1bb2cef749c: Gained carrier Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.060 [INFO][5601] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0 calico-kube-controllers-6b89bbdd5d- calico-system 7cdfa0f5-75f3-4355-882a-cc675d837452 1071 0 2025-01-29 16:13:38 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6b89bbdd5d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-6bdnt.gb1.brightbox.com calico-kube-controllers-6b89bbdd5d-tnztj eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali1bb2cef749c [] []}} ContainerID="e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" Namespace="calico-system" Pod="calico-kube-controllers-6b89bbdd5d-tnztj" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.061 [INFO][5601] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" Namespace="calico-system" Pod="calico-kube-controllers-6b89bbdd5d-tnztj" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.127 [INFO][5637] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" HandleID="k8s-pod-network.e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.141 [INFO][5637] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" HandleID="k8s-pod-network.e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000319760), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-6bdnt.gb1.brightbox.com", "pod":"calico-kube-controllers-6b89bbdd5d-tnztj", "timestamp":"2025-01-29 16:13:44.127194602 +0000 UTC"}, Hostname:"srv-6bdnt.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.141 [INFO][5637] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.142 [INFO][5637] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.142 [INFO][5637] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-6bdnt.gb1.brightbox.com' Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.146 [INFO][5637] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.153 [INFO][5637] ipam/ipam.go 372: Looking up existing affinities for host host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.163 [INFO][5637] ipam/ipam.go 489: Trying affinity for 192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.172 [INFO][5637] ipam/ipam.go 155: Attempting to load block cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.178 [INFO][5637] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.17.128/26 host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.178 [INFO][5637] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.17.128/26 handle="k8s-pod-network.e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.181 [INFO][5637] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.189 [INFO][5637] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.17.128/26 handle="k8s-pod-network.e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.198 [INFO][5637] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.17.135/26] block=192.168.17.128/26 handle="k8s-pod-network.e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.198 [INFO][5637] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.17.135/26] handle="k8s-pod-network.e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" host="srv-6bdnt.gb1.brightbox.com" Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.198 [INFO][5637] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
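[Editor's note] The IPAM trace above — acquire the host-wide lock, confirm the node's affinity for block 192.168.17.128/26, claim one address, release the lock — is, at its core, a first-free scan of the node's affine block performed under a lock. A toy model with net/netip; the assumption that .128–.134 were taken by earlier pods on this node is hypothetical (the log only shows .134 assigned earlier, to csi-node-driver-gd895), and real Calico consults its datastore rather than a map:

    package main

    import (
    	"fmt"
    	"net/netip"
    )

    // firstFree scans an affinity block for the first address not yet in use.
    // Calico does this against its datastore while holding the host-wide IPAM
    // lock seen in the trace; a plain map stands in for that state here.
    func firstFree(block netip.Prefix, used map[netip.Addr]bool) (netip.Addr, bool) {
    	for a := block.Addr(); block.Contains(a); a = a.Next() {
    		if !used[a] {
    			return a, true
    		}
    	}
    	return netip.Addr{}, false
    }

    func main() {
    	block := netip.MustParsePrefix("192.168.17.128/26")
    	used := make(map[netip.Addr]bool)
    	for a, n := block.Addr(), 0; n < 7; a, n = a.Next(), n+1 {
    		used[a] = true // .128–.134 already allocated (hypothetical)
    	}
    	ip, _ := firstFree(block, used)
    	fmt.Println(ip) // 192.168.17.135, the address claimed in the trace above
    }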
Jan 29 16:13:44.271067 containerd[1493]: 2025-01-29 16:13:44.198 [INFO][5637] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.17.135/26] IPv6=[] ContainerID="e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" HandleID="k8s-pod-network.e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0" Jan 29 16:13:44.277305 containerd[1493]: 2025-01-29 16:13:44.205 [INFO][5601] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" Namespace="calico-system" Pod="calico-kube-controllers-6b89bbdd5d-tnztj" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0", GenerateName:"calico-kube-controllers-6b89bbdd5d-", Namespace:"calico-system", SelfLink:"", UID:"7cdfa0f5-75f3-4355-882a-cc675d837452", ResourceVersion:"1071", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 13, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b89bbdd5d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-6b89bbdd5d-tnztj", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.17.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1bb2cef749c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:44.277305 containerd[1493]: 2025-01-29 16:13:44.205 [INFO][5601] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.17.135/32] ContainerID="e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" Namespace="calico-system" Pod="calico-kube-controllers-6b89bbdd5d-tnztj" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0" Jan 29 16:13:44.277305 containerd[1493]: 2025-01-29 16:13:44.206 [INFO][5601] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1bb2cef749c ContainerID="e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" Namespace="calico-system" Pod="calico-kube-controllers-6b89bbdd5d-tnztj" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0" Jan 29 16:13:44.277305 containerd[1493]: 2025-01-29 16:13:44.224 [INFO][5601] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" Namespace="calico-system" Pod="calico-kube-controllers-6b89bbdd5d-tnztj" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0" Jan 29 
16:13:44.277305 containerd[1493]: 2025-01-29 16:13:44.226 [INFO][5601] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" Namespace="calico-system" Pod="calico-kube-controllers-6b89bbdd5d-tnztj" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0", GenerateName:"calico-kube-controllers-6b89bbdd5d-", Namespace:"calico-system", SelfLink:"", UID:"7cdfa0f5-75f3-4355-882a-cc675d837452", ResourceVersion:"1071", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 13, 38, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b89bbdd5d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf", Pod:"calico-kube-controllers-6b89bbdd5d-tnztj", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.17.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1bb2cef749c", MAC:"92:f6:8a:43:b0:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:44.277305 containerd[1493]: 2025-01-29 16:13:44.258 [INFO][5601] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf" Namespace="calico-system" Pod="calico-kube-controllers-6b89bbdd5d-tnztj" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--6b89bbdd5d--tnztj-eth0" Jan 29 16:13:44.375626 kubelet[2663]: I0129 16:13:44.373624 2663 scope.go:117] "RemoveContainer" containerID="faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c" Jan 29 16:13:44.399545 containerd[1493]: time="2025-01-29T16:13:44.399044002Z" level=info msg="RemoveContainer for \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\"" Jan 29 16:13:44.419420 containerd[1493]: time="2025-01-29T16:13:44.419375890Z" level=info msg="RemoveContainer for \"faf951eaaf790f26ece22ce838d0279b72e79ea0a1625a1af242444d1379833c\" returns successfully" Jan 29 16:13:44.422109 containerd[1493]: time="2025-01-29T16:13:44.421262593Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 29 16:13:44.422109 containerd[1493]: time="2025-01-29T16:13:44.421938200Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 29 16:13:44.422109 containerd[1493]: time="2025-01-29T16:13:44.421965121Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:44.424432 containerd[1493]: time="2025-01-29T16:13:44.423774473Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 29 16:13:44.426217 containerd[1493]: time="2025-01-29T16:13:44.425590379Z" level=info msg="StopPodSandbox for \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\"" Jan 29 16:13:44.466386 systemd[1]: Started cri-containerd-e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf.scope - libcontainer container e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf. Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.515 [WARNING][5701] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e9d2429c-47e2-48a0-bc35-409f18438229", ResourceVersion:"1053", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396", Pod:"csi-node-driver-gd895", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.17.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicca68b5c8d8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.516 [INFO][5701] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.516 [INFO][5701] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" iface="eth0" netns="" Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.516 [INFO][5701] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.516 [INFO][5701] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.556 [INFO][5714] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" HandleID="k8s-pod-network.3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.556 [INFO][5714] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.556 [INFO][5714] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.572 [WARNING][5714] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" HandleID="k8s-pod-network.3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.572 [INFO][5714] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" HandleID="k8s-pod-network.3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.575 [INFO][5714] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:44.583526 containerd[1493]: 2025-01-29 16:13:44.580 [INFO][5701] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:44.583526 containerd[1493]: time="2025-01-29T16:13:44.582934932Z" level=info msg="TearDown network for sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\" successfully" Jan 29 16:13:44.583526 containerd[1493]: time="2025-01-29T16:13:44.582974358Z" level=info msg="StopPodSandbox for \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\" returns successfully" Jan 29 16:13:44.594354 containerd[1493]: time="2025-01-29T16:13:44.592098284Z" level=info msg="RemovePodSandbox for \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\"" Jan 29 16:13:44.597639 containerd[1493]: time="2025-01-29T16:13:44.597594293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b89bbdd5d-tnztj,Uid:7cdfa0f5-75f3-4355-882a-cc675d837452,Namespace:calico-system,Attempt:0,} returns sandbox id \"e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf\"" Jan 29 16:13:44.603261 containerd[1493]: time="2025-01-29T16:13:44.603223790Z" level=info msg="Forcibly stopping sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\"" Jan 29 16:13:44.619589 containerd[1493]: time="2025-01-29T16:13:44.619546595Z" level=info msg="CreateContainer within sandbox \"e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 29 16:13:44.653642 containerd[1493]: time="2025-01-29T16:13:44.653579874Z" level=info msg="CreateContainer within sandbox \"e06b2edcad412b5d62a8bb5a644ca08cf3ef75f3b65f1757bdcea786b059b2cf\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"c543c49bb84b1c4d08b95b96823558177225066c2cc9c520215c11f6ac039c34\"" Jan 29 16:13:44.655044 containerd[1493]: time="2025-01-29T16:13:44.655013237Z" level=info msg="StartContainer for \"c543c49bb84b1c4d08b95b96823558177225066c2cc9c520215c11f6ac039c34\"" Jan 29 16:13:44.719875 systemd[1]: Started cri-containerd-c543c49bb84b1c4d08b95b96823558177225066c2cc9c520215c11f6ac039c34.scope - libcontainer container c543c49bb84b1c4d08b95b96823558177225066c2cc9c520215c11f6ac039c34. Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.687 [WARNING][5741] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e9d2429c-47e2-48a0-bc35-409f18438229", ResourceVersion:"1053", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"b4223e715c8cf720436f89fedff2d89ecec1f636925f0d0441b1b4c4418db396", Pod:"csi-node-driver-gd895", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.17.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicca68b5c8d8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.688 [INFO][5741] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.688 [INFO][5741] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" iface="eth0" netns="" Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.688 [INFO][5741] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.688 [INFO][5741] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.727 [INFO][5756] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" HandleID="k8s-pod-network.3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.727 [INFO][5756] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.727 [INFO][5756] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.738 [WARNING][5756] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" HandleID="k8s-pod-network.3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.738 [INFO][5756] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" HandleID="k8s-pod-network.3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Workload="srv--6bdnt.gb1.brightbox.com-k8s-csi--node--driver--gd895-eth0" Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.740 [INFO][5756] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:44.746345 containerd[1493]: 2025-01-29 16:13:44.744 [INFO][5741] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787" Jan 29 16:13:44.749121 containerd[1493]: time="2025-01-29T16:13:44.746410551Z" level=info msg="TearDown network for sandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\" successfully" Jan 29 16:13:44.768385 containerd[1493]: time="2025-01-29T16:13:44.768043934Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 16:13:44.768385 containerd[1493]: time="2025-01-29T16:13:44.768219144Z" level=info msg="RemovePodSandbox \"3b7b3b13abc58e0ef5d7b861dec4b80320f7e502a217b12b760542751eb7c787\" returns successfully" Jan 29 16:13:44.774520 containerd[1493]: time="2025-01-29T16:13:44.774459589Z" level=info msg="StopPodSandbox for \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\"" Jan 29 16:13:44.918005 containerd[1493]: time="2025-01-29T16:13:44.917870449Z" level=info msg="StartContainer for \"c543c49bb84b1c4d08b95b96823558177225066c2cc9c520215c11f6ac039c34\" returns successfully" Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:44.865 [WARNING][5804] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0", GenerateName:"calico-apiserver-c7c6f64f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"7c206eb9-9889-4da7-833e-4f8997709e6e", ResourceVersion:"1027", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7c6f64f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811", Pod:"calico-apiserver-c7c6f64f5-jnchp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibe150f923e9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:44.868 [INFO][5804] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:44.868 [INFO][5804] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" iface="eth0" netns="" Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:44.868 [INFO][5804] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:44.869 [INFO][5804] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:45.029 [INFO][5828] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" HandleID="k8s-pod-network.b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:45.034 [INFO][5828] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:45.034 [INFO][5828] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:45.058 [WARNING][5828] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" HandleID="k8s-pod-network.b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:45.058 [INFO][5828] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" HandleID="k8s-pod-network.b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:45.061 [INFO][5828] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:45.091706 containerd[1493]: 2025-01-29 16:13:45.072 [INFO][5804] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:45.105344 containerd[1493]: time="2025-01-29T16:13:45.091978602Z" level=info msg="TearDown network for sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\" successfully" Jan 29 16:13:45.105344 containerd[1493]: time="2025-01-29T16:13:45.092027488Z" level=info msg="StopPodSandbox for \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\" returns successfully" Jan 29 16:13:45.105344 containerd[1493]: time="2025-01-29T16:13:45.094372279Z" level=info msg="RemovePodSandbox for \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\"" Jan 29 16:13:45.105344 containerd[1493]: time="2025-01-29T16:13:45.094430409Z" level=info msg="Forcibly stopping sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\"" Jan 29 16:13:45.437310 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount598263680.mount: Deactivated successfully. Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.457 [WARNING][5936] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0", GenerateName:"calico-apiserver-c7c6f64f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"7c206eb9-9889-4da7-833e-4f8997709e6e", ResourceVersion:"1027", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7c6f64f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"3875fb97dc3f7cefa44ff9739d661bc6c8b03aedc9556be937d3e7de7994d811", Pod:"calico-apiserver-c7c6f64f5-jnchp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibe150f923e9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.458 [INFO][5936] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.458 [INFO][5936] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" iface="eth0" netns="" Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.458 [INFO][5936] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.458 [INFO][5936] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.519 [INFO][5960] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" HandleID="k8s-pod-network.b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.519 [INFO][5960] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.519 [INFO][5960] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.533 [WARNING][5960] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" HandleID="k8s-pod-network.b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.533 [INFO][5960] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" HandleID="k8s-pod-network.b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--jnchp-eth0" Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.536 [INFO][5960] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:45.564481 containerd[1493]: 2025-01-29 16:13:45.556 [INFO][5936] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b" Jan 29 16:13:45.570035 containerd[1493]: time="2025-01-29T16:13:45.565152265Z" level=info msg="TearDown network for sandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\" successfully" Jan 29 16:13:45.582210 containerd[1493]: time="2025-01-29T16:13:45.582132173Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 16:13:45.582461 containerd[1493]: time="2025-01-29T16:13:45.582430410Z" level=info msg="RemovePodSandbox \"b5da80ecafa06380a54423682de883c91a24fecc2de12e7d0ac9efb7804e388b\" returns successfully" Jan 29 16:13:45.612310 systemd-networkd[1427]: cali1bb2cef749c: Gained IPv6LL Jan 29 16:13:45.756750 containerd[1493]: time="2025-01-29T16:13:45.755834296Z" level=info msg="StopPodSandbox for \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\"" Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:45.906 [WARNING][5988] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0", GenerateName:"calico-apiserver-c7c6f64f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"8ae120f8-d2d4-415d-b245-151e25c130c4", ResourceVersion:"887", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7c6f64f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9", Pod:"calico-apiserver-c7c6f64f5-hshkw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali68ee9c7f433", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:45.907 [INFO][5988] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:45.907 [INFO][5988] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" iface="eth0" netns="" Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:45.907 [INFO][5988] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:45.907 [INFO][5988] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:45.992 [INFO][5995] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" HandleID="k8s-pod-network.31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:45.994 [INFO][5995] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:45.994 [INFO][5995] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:46.011 [WARNING][5995] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" HandleID="k8s-pod-network.31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:46.011 [INFO][5995] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" HandleID="k8s-pod-network.31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:46.016 [INFO][5995] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:46.030640 containerd[1493]: 2025-01-29 16:13:46.025 [INFO][5988] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:46.063328 containerd[1493]: time="2025-01-29T16:13:46.029833108Z" level=info msg="TearDown network for sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\" successfully" Jan 29 16:13:46.063328 containerd[1493]: time="2025-01-29T16:13:46.063323529Z" level=info msg="StopPodSandbox for \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\" returns successfully" Jan 29 16:13:46.083101 containerd[1493]: time="2025-01-29T16:13:46.083058131Z" level=info msg="RemovePodSandbox for \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\"" Jan 29 16:13:46.084491 containerd[1493]: time="2025-01-29T16:13:46.083308818Z" level=info msg="Forcibly stopping sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\"" Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.293 [WARNING][6028] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0", GenerateName:"calico-apiserver-c7c6f64f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"8ae120f8-d2d4-415d-b245-151e25c130c4", ResourceVersion:"887", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"c7c6f64f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"f1b02feefb88c29f708d0602722d740291ad2114e22d3a174030527d19485dd9", Pod:"calico-apiserver-c7c6f64f5-hshkw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.17.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali68ee9c7f433", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.294 [INFO][6028] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.294 [INFO][6028] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" iface="eth0" netns="" Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.294 [INFO][6028] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.294 [INFO][6028] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.365 [INFO][6062] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" HandleID="k8s-pod-network.31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.366 [INFO][6062] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.368 [INFO][6062] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.377 [WARNING][6062] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" HandleID="k8s-pod-network.31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.378 [INFO][6062] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" HandleID="k8s-pod-network.31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--apiserver--c7c6f64f5--hshkw-eth0" Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.381 [INFO][6062] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:46.387311 containerd[1493]: 2025-01-29 16:13:46.383 [INFO][6028] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e" Jan 29 16:13:46.388942 containerd[1493]: time="2025-01-29T16:13:46.388148587Z" level=info msg="TearDown network for sandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\" successfully" Jan 29 16:13:46.395784 containerd[1493]: time="2025-01-29T16:13:46.395711145Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 16:13:46.396141 containerd[1493]: time="2025-01-29T16:13:46.395934373Z" level=info msg="RemovePodSandbox \"31ba11d38b06f94d66f92b291d29cae8616dc1808c034fa008f83721e550df5e\" returns successfully" Jan 29 16:13:46.398107 containerd[1493]: time="2025-01-29T16:13:46.398072013Z" level=info msg="StopPodSandbox for \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\"" Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.461 [WARNING][6091] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.461 [INFO][6091] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.461 [INFO][6091] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" iface="eth0" netns="" Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.461 [INFO][6091] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.461 [INFO][6091] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.494 [INFO][6097] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.494 [INFO][6097] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.494 [INFO][6097] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.505 [WARNING][6097] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.505 [INFO][6097] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.507 [INFO][6097] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:46.512296 containerd[1493]: 2025-01-29 16:13:46.509 [INFO][6091] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:46.515953 containerd[1493]: time="2025-01-29T16:13:46.512340156Z" level=info msg="TearDown network for sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\" successfully" Jan 29 16:13:46.515953 containerd[1493]: time="2025-01-29T16:13:46.512374974Z" level=info msg="StopPodSandbox for \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\" returns successfully" Jan 29 16:13:46.515953 containerd[1493]: time="2025-01-29T16:13:46.513402771Z" level=info msg="RemovePodSandbox for \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\"" Jan 29 16:13:46.515953 containerd[1493]: time="2025-01-29T16:13:46.513437451Z" level=info msg="Forcibly stopping sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\"" Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.573 [WARNING][6115] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.573 [INFO][6115] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.573 [INFO][6115] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" iface="eth0" netns="" Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.573 [INFO][6115] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.573 [INFO][6115] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.608 [INFO][6121] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.609 [INFO][6121] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.609 [INFO][6121] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.632 [WARNING][6121] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.633 [INFO][6121] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" HandleID="k8s-pod-network.1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.637 [INFO][6121] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:46.643485 containerd[1493]: 2025-01-29 16:13:46.639 [INFO][6115] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6" Jan 29 16:13:46.643485 containerd[1493]: time="2025-01-29T16:13:46.643379810Z" level=info msg="TearDown network for sandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\" successfully" Jan 29 16:13:46.657646 containerd[1493]: time="2025-01-29T16:13:46.657564223Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 16:13:46.657886 containerd[1493]: time="2025-01-29T16:13:46.657676788Z" level=info msg="RemovePodSandbox \"1b4c292674d54f93d804ccbabeba6b9b6d85569032c935f8d74191a0a0fbc9c6\" returns successfully" Jan 29 16:13:46.658892 containerd[1493]: time="2025-01-29T16:13:46.658463885Z" level=info msg="StopPodSandbox for \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\"" Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.717 [WARNING][6139] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.718 [INFO][6139] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.718 [INFO][6139] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" iface="eth0" netns="" Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.718 [INFO][6139] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.718 [INFO][6139] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.761 [INFO][6146] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.761 [INFO][6146] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.761 [INFO][6146] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.771 [WARNING][6146] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.771 [INFO][6146] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.774 [INFO][6146] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:46.778698 containerd[1493]: 2025-01-29 16:13:46.777 [INFO][6139] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:46.781692 containerd[1493]: time="2025-01-29T16:13:46.779872400Z" level=info msg="TearDown network for sandbox \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\" successfully" Jan 29 16:13:46.781692 containerd[1493]: time="2025-01-29T16:13:46.779930659Z" level=info msg="StopPodSandbox for \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\" returns successfully" Jan 29 16:13:46.781692 containerd[1493]: time="2025-01-29T16:13:46.780762696Z" level=info msg="RemovePodSandbox for \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\"" Jan 29 16:13:46.781692 containerd[1493]: time="2025-01-29T16:13:46.780812120Z" level=info msg="Forcibly stopping sandbox \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\"" Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.863 [WARNING][6175] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" WorkloadEndpoint="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.863 [INFO][6175] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.863 [INFO][6175] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" iface="eth0" netns="" Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.863 [INFO][6175] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.864 [INFO][6175] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.913 [INFO][6184] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.913 [INFO][6184] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.913 [INFO][6184] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.922 [WARNING][6184] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.922 [INFO][6184] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" HandleID="k8s-pod-network.d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Workload="srv--6bdnt.gb1.brightbox.com-k8s-calico--kube--controllers--85cfdd4458--rcthl-eth0" Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.926 [INFO][6184] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:46.930616 containerd[1493]: 2025-01-29 16:13:46.928 [INFO][6175] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca" Jan 29 16:13:46.932559 containerd[1493]: time="2025-01-29T16:13:46.930943254Z" level=info msg="TearDown network for sandbox \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\" successfully" Jan 29 16:13:46.936628 containerd[1493]: time="2025-01-29T16:13:46.936581664Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 16:13:46.936716 containerd[1493]: time="2025-01-29T16:13:46.936673515Z" level=info msg="RemovePodSandbox \"d3dec02cfce4ad016aa627cd87cb3cb560c650d86aa14325ef00b057208dc5ca\" returns successfully" Jan 29 16:13:46.937458 containerd[1493]: time="2025-01-29T16:13:46.937390185Z" level=info msg="StopPodSandbox for \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\"" Jan 29 16:13:46.937546 containerd[1493]: time="2025-01-29T16:13:46.937515107Z" level=info msg="TearDown network for sandbox \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\" successfully" Jan 29 16:13:46.937612 containerd[1493]: time="2025-01-29T16:13:46.937548424Z" level=info msg="StopPodSandbox for \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\" returns successfully" Jan 29 16:13:46.939769 containerd[1493]: time="2025-01-29T16:13:46.939634132Z" level=info msg="RemovePodSandbox for \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\"" Jan 29 16:13:46.939769 containerd[1493]: time="2025-01-29T16:13:46.939679045Z" level=info msg="Forcibly stopping sandbox \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\"" Jan 29 16:13:46.940254 containerd[1493]: time="2025-01-29T16:13:46.939905122Z" level=info msg="TearDown network for sandbox \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\" successfully" Jan 29 16:13:46.952949 containerd[1493]: time="2025-01-29T16:13:46.951818759Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 29 16:13:46.952949 containerd[1493]: time="2025-01-29T16:13:46.952755516Z" level=info msg="RemovePodSandbox \"9686c308a979b052e3843bb4e93687e55286a909b25c716f37541dfcf8c15794\" returns successfully" Jan 29 16:13:46.955047 containerd[1493]: time="2025-01-29T16:13:46.954668126Z" level=info msg="StopPodSandbox for \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\"" Jan 29 16:13:46.955047 containerd[1493]: time="2025-01-29T16:13:46.954760956Z" level=info msg="TearDown network for sandbox \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" successfully" Jan 29 16:13:46.955047 containerd[1493]: time="2025-01-29T16:13:46.954781046Z" level=info msg="StopPodSandbox for \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" returns successfully" Jan 29 16:13:46.957791 containerd[1493]: time="2025-01-29T16:13:46.956429461Z" level=info msg="RemovePodSandbox for \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\"" Jan 29 16:13:46.957791 containerd[1493]: time="2025-01-29T16:13:46.956470225Z" level=info msg="Forcibly stopping sandbox \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\"" Jan 29 16:13:46.976660 containerd[1493]: time="2025-01-29T16:13:46.976611845Z" level=info msg="TearDown network for sandbox \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" successfully" Jan 29 16:13:46.992048 containerd[1493]: time="2025-01-29T16:13:46.991122578Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 16:13:46.992048 containerd[1493]: time="2025-01-29T16:13:46.991312113Z" level=info msg="RemovePodSandbox \"4d05b29b21bdf47393c1859132451e16f74143a928c420548e3d8390e6f32409\" returns successfully" Jan 29 16:13:46.992048 containerd[1493]: time="2025-01-29T16:13:46.991969277Z" level=info msg="StopPodSandbox for \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\"" Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.119 [WARNING][6222] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"424af8c2-52c0-4e44-8235-7c2d50b2f3f1", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38", Pod:"coredns-6f6b679f8f-9s2kl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2d8bb57669c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.119 [INFO][6222] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.119 [INFO][6222] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" iface="eth0" netns="" Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.119 [INFO][6222] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.119 [INFO][6222] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.176 [INFO][6238] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" HandleID="k8s-pod-network.9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.176 [INFO][6238] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.177 [INFO][6238] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.188 [WARNING][6238] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" HandleID="k8s-pod-network.9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.188 [INFO][6238] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" HandleID="k8s-pod-network.9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.194 [INFO][6238] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:47.201258 containerd[1493]: 2025-01-29 16:13:47.197 [INFO][6222] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:47.201258 containerd[1493]: time="2025-01-29T16:13:47.201151637Z" level=info msg="TearDown network for sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\" successfully" Jan 29 16:13:47.201258 containerd[1493]: time="2025-01-29T16:13:47.201228800Z" level=info msg="StopPodSandbox for \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\" returns successfully" Jan 29 16:13:47.208471 containerd[1493]: time="2025-01-29T16:13:47.208428683Z" level=info msg="RemovePodSandbox for \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\"" Jan 29 16:13:47.208636 containerd[1493]: time="2025-01-29T16:13:47.208609345Z" level=info msg="Forcibly stopping sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\"" Jan 29 16:13:47.321324 systemd[1]: run-containerd-runc-k8s.io-c543c49bb84b1c4d08b95b96823558177225066c2cc9c520215c11f6ac039c34-runc.IzuGpT.mount: Deactivated successfully. Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.319 [WARNING][6256] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"424af8c2-52c0-4e44-8235-7c2d50b2f3f1", ResourceVersion:"871", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"628ff109569652c935ebed2c7842116ca8020c598960b4856fb20306e66d9e38", Pod:"coredns-6f6b679f8f-9s2kl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2d8bb57669c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.320 [INFO][6256] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.320 [INFO][6256] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" iface="eth0" netns="" Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.320 [INFO][6256] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.321 [INFO][6256] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.391 [INFO][6275] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" HandleID="k8s-pod-network.9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.391 [INFO][6275] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.391 [INFO][6275] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.407 [WARNING][6275] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" HandleID="k8s-pod-network.9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.407 [INFO][6275] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" HandleID="k8s-pod-network.9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--9s2kl-eth0" Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.412 [INFO][6275] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:47.418488 containerd[1493]: 2025-01-29 16:13:47.415 [INFO][6256] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede" Jan 29 16:13:47.419935 containerd[1493]: time="2025-01-29T16:13:47.419118381Z" level=info msg="TearDown network for sandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\" successfully" Jan 29 16:13:47.429673 containerd[1493]: time="2025-01-29T16:13:47.428150548Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 16:13:47.429673 containerd[1493]: time="2025-01-29T16:13:47.428257223Z" level=info msg="RemovePodSandbox \"9680552f2ac0b7df2100211124bd6d257a27297eb6b08b60bc930d230740eede\" returns successfully" Jan 29 16:13:47.430195 containerd[1493]: time="2025-01-29T16:13:47.430122664Z" level=info msg="StopPodSandbox for \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\"" Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.538 [WARNING][6315] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"ba3c4db0-9089-4994-8c25-05651ef8025d", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867", Pod:"coredns-6f6b679f8f-hvt88", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali01fc5829f8c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.538 [INFO][6315] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.538 [INFO][6315] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" iface="eth0" netns="" Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.538 [INFO][6315] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.538 [INFO][6315] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.573 [INFO][6338] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" HandleID="k8s-pod-network.d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.573 [INFO][6338] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.573 [INFO][6338] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.583 [WARNING][6338] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" HandleID="k8s-pod-network.d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.583 [INFO][6338] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" HandleID="k8s-pod-network.d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.586 [INFO][6338] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:47.590148 containerd[1493]: 2025-01-29 16:13:47.588 [INFO][6315] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:47.591784 containerd[1493]: time="2025-01-29T16:13:47.590266358Z" level=info msg="TearDown network for sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\" successfully" Jan 29 16:13:47.591784 containerd[1493]: time="2025-01-29T16:13:47.590302347Z" level=info msg="StopPodSandbox for \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\" returns successfully" Jan 29 16:13:47.591784 containerd[1493]: time="2025-01-29T16:13:47.591518850Z" level=info msg="RemovePodSandbox for \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\"" Jan 29 16:13:47.591784 containerd[1493]: time="2025-01-29T16:13:47.591557345Z" level=info msg="Forcibly stopping sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\"" Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.658 [WARNING][6356] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"ba3c4db0-9089-4994-8c25-05651ef8025d", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.January, 29, 16, 12, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-6bdnt.gb1.brightbox.com", ContainerID:"cc431ba9800a54e35aa97491105cadd0ab353a550b4ed3ad8a6179ef51525867", Pod:"coredns-6f6b679f8f-hvt88", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.17.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali01fc5829f8c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.658 [INFO][6356] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.658 [INFO][6356] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" iface="eth0" netns="" Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.658 [INFO][6356] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.658 [INFO][6356] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.709 [INFO][6362] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" HandleID="k8s-pod-network.d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.710 [INFO][6362] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.710 [INFO][6362] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.725 [WARNING][6362] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" HandleID="k8s-pod-network.d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.725 [INFO][6362] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" HandleID="k8s-pod-network.d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Workload="srv--6bdnt.gb1.brightbox.com-k8s-coredns--6f6b679f8f--hvt88-eth0" Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.727 [INFO][6362] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 29 16:13:47.733280 containerd[1493]: 2025-01-29 16:13:47.730 [INFO][6356] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2" Jan 29 16:13:47.736092 containerd[1493]: time="2025-01-29T16:13:47.733331869Z" level=info msg="TearDown network for sandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\" successfully" Jan 29 16:13:47.738774 containerd[1493]: time="2025-01-29T16:13:47.738391254Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 29 16:13:47.738774 containerd[1493]: time="2025-01-29T16:13:47.738470489Z" level=info msg="RemovePodSandbox \"d494d6ec67ed09df8063aa049b0424385d6f436ac76cc33f1e5c877312463ff2\" returns successfully" Jan 29 16:13:56.818270 kubelet[2663]: I0129 16:13:56.818187 2663 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 29 16:13:56.883282 kubelet[2663]: I0129 16:13:56.869852 2663 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6b89bbdd5d-tnztj" podStartSLOduration=18.866793158 podStartE2EDuration="18.866793158s" podCreationTimestamp="2025-01-29 16:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-29 16:13:45.004100784 +0000 UTC m=+61.023418600" watchObservedRunningTime="2025-01-29 16:13:56.866793158 +0000 UTC m=+72.886110963" Jan 29 16:14:02.001783 systemd[1]: run-containerd-runc-k8s.io-c543c49bb84b1c4d08b95b96823558177225066c2cc9c520215c11f6ac039c34-runc.7CjYpW.mount: Deactivated successfully. Jan 29 16:14:06.093682 systemd[1]: run-containerd-runc-k8s.io-59dd0e3a035a458b9576572ff07cb1c4282b6030cd73f7c4b2b8573be02cd043-runc.vcIbpl.mount: Deactivated successfully. Jan 29 16:14:13.960478 systemd[1]: run-containerd-runc-k8s.io-c543c49bb84b1c4d08b95b96823558177225066c2cc9c520215c11f6ac039c34-runc.vbErZh.mount: Deactivated successfully. Jan 29 16:14:15.829941 systemd[1]: Started sshd@9-10.230.37.146:22-139.178.68.195:50182.service - OpenSSH per-connection server daemon (139.178.68.195:50182). 
Jan 29 16:14:16.797412 sshd[6468]: Accepted publickey for core from 139.178.68.195 port 50182 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:14:16.801509 sshd[6468]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:14:16.816245 systemd-logind[1482]: New session 12 of user core. Jan 29 16:14:16.824428 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 29 16:14:18.025240 sshd[6468]: pam_unix(sshd:session): session closed for user core Jan 29 16:14:18.032036 systemd[1]: sshd@9-10.230.37.146:22-139.178.68.195:50182.service: Deactivated successfully. Jan 29 16:14:18.034708 systemd[1]: session-12.scope: Deactivated successfully. Jan 29 16:14:18.035859 systemd-logind[1482]: Session 12 logged out. Waiting for processes to exit. Jan 29 16:14:18.038030 systemd-logind[1482]: Removed session 12. Jan 29 16:14:23.187469 systemd[1]: Started sshd@10-10.230.37.146:22-139.178.68.195:50186.service - OpenSSH per-connection server daemon (139.178.68.195:50186). Jan 29 16:14:24.102104 sshd[6484]: Accepted publickey for core from 139.178.68.195 port 50186 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:14:24.106362 sshd[6484]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:14:24.116618 systemd-logind[1482]: New session 13 of user core. Jan 29 16:14:24.126423 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 29 16:14:24.835850 sshd[6484]: pam_unix(sshd:session): session closed for user core Jan 29 16:14:24.842235 systemd-logind[1482]: Session 13 logged out. Waiting for processes to exit. Jan 29 16:14:24.843499 systemd[1]: sshd@10-10.230.37.146:22-139.178.68.195:50186.service: Deactivated successfully. Jan 29 16:14:24.846757 systemd[1]: session-13.scope: Deactivated successfully. Jan 29 16:14:24.850076 systemd-logind[1482]: Removed session 13. Jan 29 16:14:30.000921 systemd[1]: Started sshd@11-10.230.37.146:22-139.178.68.195:42546.service - OpenSSH per-connection server daemon (139.178.68.195:42546). Jan 29 16:14:30.903607 sshd[6506]: Accepted publickey for core from 139.178.68.195 port 42546 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:14:30.904710 sshd[6506]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:14:30.912883 systemd-logind[1482]: New session 14 of user core. Jan 29 16:14:30.926154 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 29 16:14:31.681776 sshd[6506]: pam_unix(sshd:session): session closed for user core Jan 29 16:14:31.687361 systemd-logind[1482]: Session 14 logged out. Waiting for processes to exit. Jan 29 16:14:31.690577 systemd[1]: sshd@11-10.230.37.146:22-139.178.68.195:42546.service: Deactivated successfully. Jan 29 16:14:31.695014 systemd[1]: session-14.scope: Deactivated successfully. Jan 29 16:14:31.697371 systemd-logind[1482]: Removed session 14. Jan 29 16:14:31.841651 systemd[1]: Started sshd@12-10.230.37.146:22-139.178.68.195:42560.service - OpenSSH per-connection server daemon (139.178.68.195:42560). Jan 29 16:14:32.730644 sshd[6520]: Accepted publickey for core from 139.178.68.195 port 42560 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:14:32.731694 sshd[6520]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:14:32.739525 systemd-logind[1482]: New session 15 of user core. Jan 29 16:14:32.748416 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jan 29 16:14:33.511851 sshd[6520]: pam_unix(sshd:session): session closed for user core Jan 29 16:14:33.517820 systemd[1]: sshd@12-10.230.37.146:22-139.178.68.195:42560.service: Deactivated successfully. Jan 29 16:14:33.520405 systemd[1]: session-15.scope: Deactivated successfully. Jan 29 16:14:33.521811 systemd-logind[1482]: Session 15 logged out. Waiting for processes to exit. Jan 29 16:14:33.524021 systemd-logind[1482]: Removed session 15. Jan 29 16:14:33.677837 systemd[1]: Started sshd@13-10.230.37.146:22-139.178.68.195:42576.service - OpenSSH per-connection server daemon (139.178.68.195:42576). Jan 29 16:14:34.602187 sshd[6532]: Accepted publickey for core from 139.178.68.195 port 42576 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:14:34.604707 sshd[6532]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:14:34.612910 systemd-logind[1482]: New session 16 of user core. Jan 29 16:14:34.618352 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 29 16:14:35.318805 sshd[6532]: pam_unix(sshd:session): session closed for user core Jan 29 16:14:35.323302 systemd-logind[1482]: Session 16 logged out. Waiting for processes to exit. Jan 29 16:14:35.323828 systemd[1]: sshd@13-10.230.37.146:22-139.178.68.195:42576.service: Deactivated successfully. Jan 29 16:14:35.327271 systemd[1]: session-16.scope: Deactivated successfully. Jan 29 16:14:35.329901 systemd-logind[1482]: Removed session 16. Jan 29 16:14:40.488602 systemd[1]: Started sshd@14-10.230.37.146:22-139.178.68.195:55598.service - OpenSSH per-connection server daemon (139.178.68.195:55598). Jan 29 16:14:41.375376 sshd[6573]: Accepted publickey for core from 139.178.68.195 port 55598 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:14:41.379323 sshd[6573]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:14:41.389265 systemd-logind[1482]: New session 17 of user core. Jan 29 16:14:41.394438 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 29 16:14:42.120830 sshd[6573]: pam_unix(sshd:session): session closed for user core Jan 29 16:14:42.126775 systemd[1]: sshd@14-10.230.37.146:22-139.178.68.195:55598.service: Deactivated successfully. Jan 29 16:14:42.130044 systemd[1]: session-17.scope: Deactivated successfully. Jan 29 16:14:42.131703 systemd-logind[1482]: Session 17 logged out. Waiting for processes to exit. Jan 29 16:14:42.133420 systemd-logind[1482]: Removed session 17. Jan 29 16:14:47.285570 systemd[1]: Started sshd@15-10.230.37.146:22-139.178.68.195:36532.service - OpenSSH per-connection server daemon (139.178.68.195:36532). Jan 29 16:14:48.190211 sshd[6608]: Accepted publickey for core from 139.178.68.195 port 36532 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:14:48.193770 sshd[6608]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:14:48.203741 systemd-logind[1482]: New session 18 of user core. Jan 29 16:14:48.209371 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 29 16:14:48.922604 sshd[6608]: pam_unix(sshd:session): session closed for user core Jan 29 16:14:48.928543 systemd[1]: sshd@15-10.230.37.146:22-139.178.68.195:36532.service: Deactivated successfully. Jan 29 16:14:48.931974 systemd[1]: session-18.scope: Deactivated successfully. Jan 29 16:14:48.934528 systemd-logind[1482]: Session 18 logged out. Waiting for processes to exit. Jan 29 16:14:48.936193 systemd-logind[1482]: Removed session 18. 
Jan 29 16:14:54.081579 systemd[1]: Started sshd@16-10.230.37.146:22-139.178.68.195:36548.service - OpenSSH per-connection server daemon (139.178.68.195:36548). Jan 29 16:14:54.998697 sshd[6623]: Accepted publickey for core from 139.178.68.195 port 36548 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:14:55.001197 sshd[6623]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:14:55.008833 systemd-logind[1482]: New session 19 of user core. Jan 29 16:14:55.015963 systemd[1]: Started session-19.scope - Session 19 of User core. Jan 29 16:14:55.718798 sshd[6623]: pam_unix(sshd:session): session closed for user core Jan 29 16:14:55.724578 systemd-logind[1482]: Session 19 logged out. Waiting for processes to exit. Jan 29 16:14:55.725930 systemd[1]: sshd@16-10.230.37.146:22-139.178.68.195:36548.service: Deactivated successfully. Jan 29 16:14:55.728938 systemd[1]: session-19.scope: Deactivated successfully. Jan 29 16:14:55.730672 systemd-logind[1482]: Removed session 19. Jan 29 16:15:00.880581 systemd[1]: Started sshd@17-10.230.37.146:22-139.178.68.195:51516.service - OpenSSH per-connection server daemon (139.178.68.195:51516). Jan 29 16:15:01.779220 sshd[6636]: Accepted publickey for core from 139.178.68.195 port 51516 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:15:01.781732 sshd[6636]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:15:01.792857 systemd-logind[1482]: New session 20 of user core. Jan 29 16:15:01.799459 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 29 16:15:02.508899 sshd[6636]: pam_unix(sshd:session): session closed for user core Jan 29 16:15:02.515709 systemd-logind[1482]: Session 20 logged out. Waiting for processes to exit. Jan 29 16:15:02.516456 systemd[1]: sshd@17-10.230.37.146:22-139.178.68.195:51516.service: Deactivated successfully. Jan 29 16:15:02.520626 systemd[1]: session-20.scope: Deactivated successfully. Jan 29 16:15:02.522216 systemd-logind[1482]: Removed session 20. Jan 29 16:15:02.665290 systemd[1]: Started sshd@18-10.230.37.146:22-139.178.68.195:51530.service - OpenSSH per-connection server daemon (139.178.68.195:51530). Jan 29 16:15:03.586809 sshd[6669]: Accepted publickey for core from 139.178.68.195 port 51530 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:15:03.589394 sshd[6669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:15:03.598108 systemd-logind[1482]: New session 21 of user core. Jan 29 16:15:03.603417 systemd[1]: Started session-21.scope - Session 21 of User core. Jan 29 16:15:04.735939 sshd[6669]: pam_unix(sshd:session): session closed for user core Jan 29 16:15:04.747838 systemd-logind[1482]: Session 21 logged out. Waiting for processes to exit. Jan 29 16:15:04.748299 systemd[1]: sshd@18-10.230.37.146:22-139.178.68.195:51530.service: Deactivated successfully. Jan 29 16:15:04.750842 systemd[1]: session-21.scope: Deactivated successfully. Jan 29 16:15:04.752671 systemd-logind[1482]: Removed session 21. Jan 29 16:15:04.895553 systemd[1]: Started sshd@19-10.230.37.146:22-139.178.68.195:51532.service - OpenSSH per-connection server daemon (139.178.68.195:51532). 
Jan 29 16:15:05.793284 sshd[6679]: Accepted publickey for core from 139.178.68.195 port 51532 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:15:05.798854 sshd[6679]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:15:05.812317 systemd-logind[1482]: New session 22 of user core. Jan 29 16:15:05.818374 systemd[1]: Started session-22.scope - Session 22 of User core. Jan 29 16:15:06.088132 systemd[1]: run-containerd-runc-k8s.io-59dd0e3a035a458b9576572ff07cb1c4282b6030cd73f7c4b2b8573be02cd043-runc.l2raeK.mount: Deactivated successfully. Jan 29 16:15:09.594615 sshd[6679]: pam_unix(sshd:session): session closed for user core Jan 29 16:15:09.613520 systemd[1]: sshd@19-10.230.37.146:22-139.178.68.195:51532.service: Deactivated successfully. Jan 29 16:15:09.617759 systemd[1]: session-22.scope: Deactivated successfully. Jan 29 16:15:09.620463 systemd-logind[1482]: Session 22 logged out. Waiting for processes to exit. Jan 29 16:15:09.622649 systemd-logind[1482]: Removed session 22. Jan 29 16:15:09.759820 systemd[1]: Started sshd@20-10.230.37.146:22-139.178.68.195:48588.service - OpenSSH per-connection server daemon (139.178.68.195:48588). Jan 29 16:15:10.721431 sshd[6728]: Accepted publickey for core from 139.178.68.195 port 48588 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:15:10.723612 sshd[6728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:15:10.738307 systemd-logind[1482]: New session 23 of user core. Jan 29 16:15:10.747452 systemd[1]: Started session-23.scope - Session 23 of User core. Jan 29 16:15:11.963661 sshd[6728]: pam_unix(sshd:session): session closed for user core Jan 29 16:15:11.970410 systemd[1]: sshd@20-10.230.37.146:22-139.178.68.195:48588.service: Deactivated successfully. Jan 29 16:15:11.970605 systemd-logind[1482]: Session 23 logged out. Waiting for processes to exit. Jan 29 16:15:11.976680 systemd[1]: session-23.scope: Deactivated successfully. Jan 29 16:15:11.980064 systemd-logind[1482]: Removed session 23. Jan 29 16:15:12.122787 systemd[1]: Started sshd@21-10.230.37.146:22-139.178.68.195:48594.service - OpenSSH per-connection server daemon (139.178.68.195:48594). Jan 29 16:15:13.049089 sshd[6739]: Accepted publickey for core from 139.178.68.195 port 48594 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:15:13.050775 sshd[6739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:15:13.060526 systemd-logind[1482]: New session 24 of user core. Jan 29 16:15:13.068397 systemd[1]: Started session-24.scope - Session 24 of User core. Jan 29 16:15:13.771326 sshd[6739]: pam_unix(sshd:session): session closed for user core Jan 29 16:15:13.776249 systemd-logind[1482]: Session 24 logged out. Waiting for processes to exit. Jan 29 16:15:13.777671 systemd[1]: sshd@21-10.230.37.146:22-139.178.68.195:48594.service: Deactivated successfully. Jan 29 16:15:13.780656 systemd[1]: session-24.scope: Deactivated successfully. Jan 29 16:15:13.782915 systemd-logind[1482]: Removed session 24. 
Jan 29 16:15:15.422645 update_engine[1483]: I20250129 16:15:15.422207 1483 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 29 16:15:15.422645 update_engine[1483]: I20250129 16:15:15.422384 1483 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 29 16:15:15.425264 update_engine[1483]: I20250129 16:15:15.425024 1483 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 29 16:15:15.427525 update_engine[1483]: I20250129 16:15:15.426954 1483 omaha_request_params.cc:62] Current group set to lts Jan 29 16:15:15.428189 update_engine[1483]: I20250129 16:15:15.428026 1483 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 29 16:15:15.428189 update_engine[1483]: I20250129 16:15:15.428069 1483 update_attempter.cc:643] Scheduling an action processor start. Jan 29 16:15:15.428189 update_engine[1483]: I20250129 16:15:15.428128 1483 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 29 16:15:15.429210 update_engine[1483]: I20250129 16:15:15.428675 1483 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 29 16:15:15.429210 update_engine[1483]: I20250129 16:15:15.428802 1483 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 29 16:15:15.429210 update_engine[1483]: I20250129 16:15:15.428820 1483 omaha_request_action.cc:272] Request: Jan 29 16:15:15.429210 update_engine[1483]: [Omaha request XML body not captured] Jan 29 16:15:15.429210 update_engine[1483]: I20250129 16:15:15.428845 1483 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 29 16:15:15.438376 update_engine[1483]: I20250129 16:15:15.437762 1483 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 29 16:15:15.438376 update_engine[1483]: I20250129 16:15:15.438287 1483 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 29 16:15:15.446411 update_engine[1483]: E20250129 16:15:15.446254 1483 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jan 29 16:15:15.446411 update_engine[1483]: I20250129 16:15:15.446341 1483 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 29 16:15:15.473160 locksmithd[1515]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 29 16:15:18.937585 systemd[1]: Started sshd@22-10.230.37.146:22-139.178.68.195:50166.service - OpenSSH per-connection server daemon (139.178.68.195:50166). Jan 29 16:15:19.857206 sshd[6774]: Accepted publickey for core from 139.178.68.195 port 50166 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:15:19.860704 sshd[6774]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:15:19.871699 systemd-logind[1482]: New session 25 of user core. Jan 29 16:15:19.880398 systemd[1]: Started session-25.scope - Session 25 of User core. Jan 29 16:15:20.647462 sshd[6774]: pam_unix(sshd:session): session closed for user core Jan 29 16:15:20.652614 systemd[1]: sshd@22-10.230.37.146:22-139.178.68.195:50166.service: Deactivated successfully. Jan 29 16:15:20.656952 systemd[1]: session-25.scope: Deactivated successfully.
Jan 29 16:15:20.658719 systemd-logind[1482]: Session 25 logged out. Waiting for processes to exit. Jan 29 16:15:20.660345 systemd-logind[1482]: Removed session 25. Jan 29 16:15:25.333584 update_engine[1483]: I20250129 16:15:25.333348 1483 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 29 16:15:25.334490 update_engine[1483]: I20250129 16:15:25.334120 1483 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 29 16:15:25.334680 update_engine[1483]: I20250129 16:15:25.334640 1483 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 29 16:15:25.340031 update_engine[1483]: E20250129 16:15:25.339975 1483 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jan 29 16:15:25.340118 update_engine[1483]: I20250129 16:15:25.340078 1483 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 29 16:15:25.810615 systemd[1]: Started sshd@23-10.230.37.146:22-139.178.68.195:56388.service - OpenSSH per-connection server daemon (139.178.68.195:56388). Jan 29 16:15:26.747665 sshd[6812]: Accepted publickey for core from 139.178.68.195 port 56388 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:15:26.750569 sshd[6812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:15:26.759056 systemd-logind[1482]: New session 26 of user core. Jan 29 16:15:26.770453 systemd[1]: Started session-26.scope - Session 26 of User core. Jan 29 16:15:27.506399 sshd[6812]: pam_unix(sshd:session): session closed for user core Jan 29 16:15:27.513131 systemd[1]: sshd@23-10.230.37.146:22-139.178.68.195:56388.service: Deactivated successfully. Jan 29 16:15:27.516777 systemd[1]: session-26.scope: Deactivated successfully. Jan 29 16:15:27.519453 systemd-logind[1482]: Session 26 logged out. Waiting for processes to exit. Jan 29 16:15:27.522346 systemd-logind[1482]: Removed session 26. Jan 29 16:15:32.669556 systemd[1]: Started sshd@24-10.230.37.146:22-139.178.68.195:56390.service - OpenSSH per-connection server daemon (139.178.68.195:56390). Jan 29 16:15:33.590298 sshd[6825]: Accepted publickey for core from 139.178.68.195 port 56390 ssh2: RSA SHA256:iOkgT8Td6lnIZz4pNkw8ub6MVwYW40qiGb8+hDe1tnw Jan 29 16:15:33.594337 sshd[6825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 29 16:15:33.603760 systemd-logind[1482]: New session 27 of user core. Jan 29 16:15:33.613404 systemd[1]: Started session-27.scope - Session 27 of User core. Jan 29 16:15:34.319319 sshd[6825]: pam_unix(sshd:session): session closed for user core Jan 29 16:15:34.324715 systemd[1]: sshd@24-10.230.37.146:22-139.178.68.195:56390.service: Deactivated successfully. Jan 29 16:15:34.325380 systemd-logind[1482]: Session 27 logged out. Waiting for processes to exit. Jan 29 16:15:34.328683 systemd[1]: session-27.scope: Deactivated successfully. Jan 29 16:15:34.331271 systemd-logind[1482]: Removed session 27. Jan 29 16:15:35.330973 update_engine[1483]: I20250129 16:15:35.330793 1483 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 29 16:15:35.331815 update_engine[1483]: I20250129 16:15:35.331488 1483 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 29 16:15:35.332191 update_engine[1483]: I20250129 16:15:35.331929 1483 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 29 16:15:35.332500 update_engine[1483]: E20250129 16:15:35.332449 1483 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jan 29 16:15:35.332576 update_engine[1483]: I20250129 16:15:35.332535 1483 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 29 16:15:36.093902 systemd[1]: run-containerd-runc-k8s.io-59dd0e3a035a458b9576572ff07cb1c4282b6030cd73f7c4b2b8573be02cd043-runc.Y8qwIQ.mount: Deactivated successfully.