Jan 17 13:42:23.052796 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p17) 13.3.1 20240614, GNU ld (Gentoo 2.42 p3) 2.42.0) #1 SMP PREEMPT_DYNAMIC Fri Jan 17 10:39:07 -00 2025
Jan 17 13:42:23.052835 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=bf1e0d81a0170850ab02d370c1a7c7a3f5983c980b3730f748240a3bda2dbb2e
Jan 17 13:42:23.052850 kernel: BIOS-provided physical RAM map:
Jan 17 13:42:23.052867 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 17 13:42:23.052877 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 17 13:42:23.052888 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 17 13:42:23.052899 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
Jan 17 13:42:23.052910 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
Jan 17 13:42:23.052920 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Jan 17 13:42:23.052931 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Jan 17 13:42:23.052942 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 17 13:42:23.052952 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 17 13:42:23.052968 kernel: NX (Execute Disable) protection: active
Jan 17 13:42:23.052979 kernel: APIC: Static calls initialized
Jan 17 13:42:23.052992 kernel: SMBIOS 2.8 present.
Jan 17 13:42:23.053004 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
Jan 17 13:42:23.053016 kernel: Hypervisor detected: KVM
Jan 17 13:42:23.053032 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 17 13:42:23.053043 kernel: kvm-clock: using sched offset of 4358304413 cycles
Jan 17 13:42:23.053068 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 17 13:42:23.053080 kernel: tsc: Detected 2499.998 MHz processor
Jan 17 13:42:23.053091 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 17 13:42:23.053103 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 17 13:42:23.053114 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
Jan 17 13:42:23.053125 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 17 13:42:23.053136 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 17 13:42:23.053152 kernel: Using GB pages for direct mapping
Jan 17 13:42:23.053164 kernel: ACPI: Early table checksum verification disabled
Jan 17 13:42:23.053175 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 17 13:42:23.053203 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 17 13:42:23.053215 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jan 17 13:42:23.053226 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 17 13:42:23.053238 kernel: ACPI: FACS 0x000000007FFDFD40 000040
Jan 17 13:42:23.053262 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 17 13:42:23.053273 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 17 13:42:23.053289 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 17 13:42:23.053300 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jan 17 13:42:23.053324 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
Jan 17 13:42:23.053336 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
Jan 17 13:42:23.053348 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
Jan 17 13:42:23.053366 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
Jan 17 13:42:23.053378 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
Jan 17 13:42:23.053395 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
Jan 17 13:42:23.053408 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
Jan 17 13:42:23.053442 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 17 13:42:23.053455 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0
Jan 17 13:42:23.053470 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 17 13:42:23.053482 kernel: SRAT: PXM 0 -> APIC 0x03 -> Node 0
Jan 17 13:42:23.053494 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 17 13:42:23.053511 kernel: SRAT: PXM 0 -> APIC 0x05 -> Node 0
Jan 17 13:42:23.053524 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 17 13:42:23.053536 kernel: SRAT: PXM 0 -> APIC 0x07 -> Node 0
Jan 17 13:42:23.053548 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 17 13:42:23.053560 kernel: SRAT: PXM 0 -> APIC 0x09 -> Node 0
Jan 17 13:42:23.053572 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 17 13:42:23.053584 kernel: SRAT: PXM 0 -> APIC 0x0b -> Node 0
Jan 17 13:42:23.053596 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 17 13:42:23.053608 kernel: SRAT: PXM 0 -> APIC 0x0d -> Node 0
Jan 17 13:42:23.053620 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 17 13:42:23.053640 kernel: SRAT: PXM 0 -> APIC 0x0f -> Node 0
Jan 17 13:42:23.053652 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 17 13:42:23.053664 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 17 13:42:23.053676 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
Jan 17 13:42:23.053691 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00000000-0x7ffdbfff]
Jan 17 13:42:23.053716 kernel: NODE_DATA(0) allocated [mem 0x7ffd6000-0x7ffdbfff]
Jan 17 13:42:23.053728 kernel: Zone ranges:
Jan 17 13:42:23.053740 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 17 13:42:23.053763 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
Jan 17 13:42:23.053795 kernel: Normal empty
Jan 17 13:42:23.053807 kernel: Movable zone start for each node
Jan 17 13:42:23.053820 kernel: Early memory node ranges
Jan 17 13:42:23.053831 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Jan 17 13:42:23.053843 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
Jan 17 13:42:23.053855 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
Jan 17 13:42:23.053867 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 17 13:42:23.053879 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 17 13:42:23.053892 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
Jan 17 13:42:23.053904 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 17 13:42:23.053920 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 17 13:42:23.053933 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 17 13:42:23.053945 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 17 13:42:23.053957 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 17 13:42:23.053969 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 17 13:42:23.053981 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 17 13:42:23.053993 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 17 13:42:23.054005 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 17 13:42:23.054017 kernel: TSC deadline timer available
Jan 17 13:42:23.054034 kernel: smpboot: Allowing 16 CPUs, 14 hotplug CPUs
Jan 17 13:42:23.054046 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 17 13:42:23.054058 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Jan 17 13:42:23.054070 kernel: Booting paravirtualized kernel on KVM
Jan 17 13:42:23.054091 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 17 13:42:23.054103 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Jan 17 13:42:23.054115 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Jan 17 13:42:23.054127 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Jan 17 13:42:23.054139 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Jan 17 13:42:23.054163 kernel: kvm-guest: PV spinlocks enabled
Jan 17 13:42:23.054175 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jan 17 13:42:23.054188 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=bf1e0d81a0170850ab02d370c1a7c7a3f5983c980b3730f748240a3bda2dbb2e
Jan 17 13:42:23.054201 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 17 13:42:23.054213 kernel: random: crng init done
Jan 17 13:42:23.054225 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 17 13:42:23.054238 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 17 13:42:23.054250 kernel: Fallback order for Node 0: 0
Jan 17 13:42:23.054279 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515804
Jan 17 13:42:23.054291 kernel: Policy zone: DMA32
Jan 17 13:42:23.054303 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 17 13:42:23.054315 kernel: software IO TLB: area num 16.
Jan 17 13:42:23.054339 kernel: Memory: 1901528K/2096616K available (12288K kernel code, 2299K rwdata, 22728K rodata, 42848K init, 2344K bss, 194828K reserved, 0K cma-reserved)
Jan 17 13:42:23.054352 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Jan 17 13:42:23.054364 kernel: Kernel/User page tables isolation: enabled
Jan 17 13:42:23.054376 kernel: ftrace: allocating 37918 entries in 149 pages
Jan 17 13:42:23.054388 kernel: ftrace: allocated 149 pages with 4 groups
Jan 17 13:42:23.056426 kernel: Dynamic Preempt: voluntary
Jan 17 13:42:23.056443 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 17 13:42:23.056457 kernel: rcu: RCU event tracing is enabled.
Jan 17 13:42:23.056470 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Jan 17 13:42:23.056483 kernel: Trampoline variant of Tasks RCU enabled.
Jan 17 13:42:23.056514 kernel: Rude variant of Tasks RCU enabled.
Jan 17 13:42:23.056531 kernel: Tracing variant of Tasks RCU enabled.
Jan 17 13:42:23.056544 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 17 13:42:23.056557 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Jan 17 13:42:23.056570 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
Jan 17 13:42:23.056583 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 17 13:42:23.056600 kernel: Console: colour VGA+ 80x25
Jan 17 13:42:23.056613 kernel: printk: console [tty0] enabled
Jan 17 13:42:23.056626 kernel: printk: console [ttyS0] enabled
Jan 17 13:42:23.056639 kernel: ACPI: Core revision 20230628
Jan 17 13:42:23.056652 kernel: APIC: Switch to symmetric I/O mode setup
Jan 17 13:42:23.056665 kernel: x2apic enabled
Jan 17 13:42:23.056683 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 17 13:42:23.056704 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Jan 17 13:42:23.056717 kernel: Calibrating delay loop (skipped) preset value.. 4999.99 BogoMIPS (lpj=2499998)
Jan 17 13:42:23.056730 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 17 13:42:23.056743 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
Jan 17 13:42:23.056778 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
Jan 17 13:42:23.056791 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 17 13:42:23.056804 kernel: Spectre V2 : Mitigation: Retpolines
Jan 17 13:42:23.056817 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Jan 17 13:42:23.056835 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Jan 17 13:42:23.056848 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
Jan 17 13:42:23.056861 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 17 13:42:23.056873 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 17 13:42:23.056886 kernel: MDS: Mitigation: Clear CPU buffers
Jan 17 13:42:23.056899 kernel: MMIO Stale Data: Unknown: No mitigations
Jan 17 13:42:23.056911 kernel: SRBDS: Unknown: Dependent on hypervisor status
Jan 17 13:42:23.056924 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 17 13:42:23.056937 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 17 13:42:23.056950 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 17 13:42:23.056962 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 17 13:42:23.056980 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Jan 17 13:42:23.056993 kernel: Freeing SMP alternatives memory: 32K
Jan 17 13:42:23.057005 kernel: pid_max: default: 32768 minimum: 301
Jan 17 13:42:23.057018 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 17 13:42:23.057031 kernel: landlock: Up and running.
Jan 17 13:42:23.057043 kernel: SELinux: Initializing.
Jan 17 13:42:23.057069 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 17 13:42:23.057081 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 17 13:42:23.057094 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
Jan 17 13:42:23.057106 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 17 13:42:23.057132 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 17 13:42:23.057150 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Jan 17 13:42:23.057163 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
Jan 17 13:42:23.057176 kernel: signal: max sigframe size: 1776
Jan 17 13:42:23.057189 kernel: rcu: Hierarchical SRCU implementation.
Jan 17 13:42:23.057202 kernel: rcu: Max phase no-delay instances is 400.
Jan 17 13:42:23.057215 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Jan 17 13:42:23.057228 kernel: smp: Bringing up secondary CPUs ...
Jan 17 13:42:23.057240 kernel: smpboot: x86: Booting SMP configuration:
Jan 17 13:42:23.057253 kernel: .... node #0, CPUs: #1
Jan 17 13:42:23.057271 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Jan 17 13:42:23.057283 kernel: smp: Brought up 1 node, 2 CPUs
Jan 17 13:42:23.057296 kernel: smpboot: Max logical packages: 16
Jan 17 13:42:23.057309 kernel: smpboot: Total of 2 processors activated (9999.99 BogoMIPS)
Jan 17 13:42:23.057322 kernel: devtmpfs: initialized
Jan 17 13:42:23.057334 kernel: x86/mm: Memory block size: 128MB
Jan 17 13:42:23.057347 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 17 13:42:23.057360 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Jan 17 13:42:23.057373 kernel: pinctrl core: initialized pinctrl subsystem
Jan 17 13:42:23.057409 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 17 13:42:23.057422 kernel: audit: initializing netlink subsys (disabled)
Jan 17 13:42:23.057435 kernel: audit: type=2000 audit(1737121341.712:1): state=initialized audit_enabled=0 res=1
Jan 17 13:42:23.057448 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 17 13:42:23.057461 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 17 13:42:23.057473 kernel: cpuidle: using governor menu
Jan 17 13:42:23.057486 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 17 13:42:23.057499 kernel: dca service started, version 1.12.1
Jan 17 13:42:23.057512 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Jan 17 13:42:23.057531 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
Jan 17 13:42:23.057544 kernel: PCI: Using configuration type 1 for base access
Jan 17 13:42:23.057557 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 17 13:42:23.057570 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 17 13:42:23.057583 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 17 13:42:23.057595 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 17 13:42:23.057609 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 17 13:42:23.057621 kernel: ACPI: Added _OSI(Module Device)
Jan 17 13:42:23.057634 kernel: ACPI: Added _OSI(Processor Device)
Jan 17 13:42:23.057652 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Jan 17 13:42:23.057666 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 17 13:42:23.057679 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 17 13:42:23.057691 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Jan 17 13:42:23.057704 kernel: ACPI: Interpreter enabled
Jan 17 13:42:23.057717 kernel: ACPI: PM: (supports S0 S5)
Jan 17 13:42:23.057729 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 17 13:42:23.057742 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 17 13:42:23.057765 kernel: PCI: Using E820 reservations for host bridge windows
Jan 17 13:42:23.057784 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jan 17 13:42:23.057797 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 17 13:42:23.058059 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jan 17 13:42:23.058269 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jan 17 13:42:23.060506 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jan 17 13:42:23.060531 kernel: PCI host bridge to bus 0000:00
Jan 17 13:42:23.060773 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jan 17 13:42:23.060945 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jan 17 13:42:23.061113 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 17 13:42:23.061260 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
Jan 17 13:42:23.061428 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 17 13:42:23.061602 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
Jan 17 13:42:23.061804 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 17 13:42:23.062015 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Jan 17 13:42:23.062243 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000
Jan 17 13:42:23.064481 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfa000000-0xfbffffff pref]
Jan 17 13:42:23.064682 kernel: pci 0000:00:01.0: reg 0x14: [mem 0xfea50000-0xfea50fff]
Jan 17 13:42:23.064882 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea40000-0xfea4ffff pref]
Jan 17 13:42:23.065059 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 17 13:42:23.065297 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Jan 17 13:42:23.067567 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea51000-0xfea51fff]
Jan 17 13:42:23.067823 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Jan 17 13:42:23.068002 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea52000-0xfea52fff]
Jan 17 13:42:23.068214 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Jan 17 13:42:23.068414 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea53000-0xfea53fff]
Jan 17 13:42:23.068605 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Jan 17 13:42:23.068819 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea54000-0xfea54fff]
Jan 17 13:42:23.069039 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Jan 17 13:42:23.069223 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea55000-0xfea55fff]
Jan 17 13:42:23.069411 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Jan 17 13:42:23.069564 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea56000-0xfea56fff]
Jan 17 13:42:23.069776 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Jan 17 13:42:23.069958 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea57000-0xfea57fff]
Jan 17 13:42:23.070192 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Jan 17 13:42:23.070362 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea58000-0xfea58fff]
Jan 17 13:42:23.072646 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Jan 17 13:42:23.072888 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc0c0-0xc0df]
Jan 17 13:42:23.073064 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfea59000-0xfea59fff]
Jan 17 13:42:23.073234 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Jan 17 13:42:23.076840 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfea00000-0xfea3ffff pref]
Jan 17 13:42:23.077059 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Jan 17 13:42:23.077254 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Jan 17 13:42:23.077473 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfea5a000-0xfea5afff]
Jan 17 13:42:23.077644 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfd004000-0xfd007fff 64bit pref]
Jan 17 13:42:23.077860 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Jan 17 13:42:23.078030 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jan 17 13:42:23.078237 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Jan 17 13:42:23.078430 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc0e0-0xc0ff]
Jan 17 13:42:23.078615 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea5b000-0xfea5bfff]
Jan 17 13:42:23.078842 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Jan 17 13:42:23.079010 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Jan 17 13:42:23.079234 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400
Jan 17 13:42:23.080499 kernel: pci 0000:01:00.0: reg 0x10: [mem 0xfda00000-0xfda000ff 64bit]
Jan 17 13:42:23.080686 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 17 13:42:23.080879 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Jan 17 13:42:23.081049 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 17 13:42:23.081261 kernel: pci_bus 0000:02: extended config space not accessible
Jan 17 13:42:23.084759 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000
Jan 17 13:42:23.084966 kernel: pci 0000:02:01.0: reg 0x10: [mem 0xfd800000-0xfd80000f]
Jan 17 13:42:23.085147 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 17 13:42:23.085330 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jan 17 13:42:23.085614 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330
Jan 17 13:42:23.085817 kernel: pci 0000:03:00.0: reg 0x10: [mem 0xfe800000-0xfe803fff 64bit]
Jan 17 13:42:23.086031 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 17 13:42:23.086200 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Jan 17 13:42:23.086401 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 17 13:42:23.086610 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00
Jan 17 13:42:23.086823 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Jan 17 13:42:23.086993 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 17 13:42:23.087164 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Jan 17 13:42:23.087340 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 17 13:42:23.088269 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 17 13:42:23.088480 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Jan 17 13:42:23.088681 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 17 13:42:23.088880 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 17 13:42:23.089062 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Jan 17 13:42:23.089249 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 17 13:42:23.089488 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 17 13:42:23.089677 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Jan 17 13:42:23.089864 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 17 13:42:23.090044 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 17 13:42:23.090247 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Jan 17 13:42:23.093533 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 17 13:42:23.093799 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 17 13:42:23.093977 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Jan 17 13:42:23.094164 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 17 13:42:23.094183 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 17 13:42:23.094196 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 17 13:42:23.094221 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 17 13:42:23.094242 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 17 13:42:23.094255 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jan 17 13:42:23.094267 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jan 17 13:42:23.094280 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jan 17 13:42:23.094305 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jan 17 13:42:23.094318 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jan 17 13:42:23.094332 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jan 17 13:42:23.094344 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jan 17 13:42:23.094357 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jan 17 13:42:23.094376 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jan 17 13:42:23.094389 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jan 17 13:42:23.094402 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jan 17 13:42:23.094438 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jan 17 13:42:23.094452 kernel: iommu: Default domain type: Translated
Jan 17 13:42:23.094465 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 17 13:42:23.094478 kernel: PCI: Using ACPI for IRQ routing
Jan 17 13:42:23.094491 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 17 13:42:23.094504 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 17 13:42:23.094524 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
Jan 17 13:42:23.094693 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jan 17 13:42:23.094878 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jan 17 13:42:23.095045 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 17 13:42:23.095065 kernel: vgaarb: loaded
Jan 17 13:42:23.095079 kernel: clocksource: Switched to clocksource kvm-clock
Jan 17 13:42:23.095092 kernel: VFS: Disk quotas dquot_6.6.0
Jan 17 13:42:23.095106 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 17 13:42:23.095126 kernel: pnp: PnP ACPI init
Jan 17 13:42:23.095345 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Jan 17 13:42:23.097501 kernel: pnp: PnP ACPI: found 5 devices
Jan 17 13:42:23.097521 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 17 13:42:23.097535 kernel: NET: Registered PF_INET protocol family
Jan 17 13:42:23.097549 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 17 13:42:23.097562 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jan 17 13:42:23.097575 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 17 13:42:23.097588 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 17 13:42:23.097611 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jan 17 13:42:23.097624 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jan 17 13:42:23.097637 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 17 13:42:23.097650 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 17 13:42:23.097663 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 17 13:42:23.097676 kernel: NET: Registered PF_XDP protocol family
Jan 17 13:42:23.097875 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
Jan 17 13:42:23.098064 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jan 17 13:42:23.098262 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jan 17 13:42:23.098494 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jan 17 13:42:23.098678 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 17 13:42:23.098863 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 17 13:42:23.099043 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 17 13:42:23.099207 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 17 13:42:23.100382 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Jan 17 13:42:23.100604 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Jan 17 13:42:23.100805 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Jan 17 13:42:23.100978 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Jan 17 13:42:23.101159 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Jan 17 13:42:23.101339 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Jan 17 13:42:23.103554 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Jan 17 13:42:23.103743 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Jan 17 13:42:23.103966 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
Jan 17 13:42:23.104165 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jan 17 13:42:23.104339 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
Jan 17 13:42:23.106569 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Jan 17 13:42:23.106747 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
Jan 17 13:42:23.106945 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 17 13:42:23.107116 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
Jan 17 13:42:23.107295 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Jan 17 13:42:23.107499 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
Jan 17 13:42:23.107678 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 17 13:42:23.107867 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
Jan 17 13:42:23.108040 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Jan 17 13:42:23.108220 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
Jan 17 13:42:23.109524 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 17 13:42:23.109804 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
Jan 17 13:42:23.109987 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Jan 17 13:42:23.110196 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
Jan 17 13:42:23.110381 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 17 13:42:23.110590 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
Jan 17 13:42:23.110796 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Jan 17 13:42:23.110984 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
Jan 17 13:42:23.111157 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 17 13:42:23.111345 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
Jan 17 13:42:23.111562 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Jan 17 13:42:23.111764 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
Jan 17 13:42:23.111950 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 17 13:42:23.112148 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
Jan 17 13:42:23.112310 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Jan 17 13:42:23.113514 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
Jan 17 13:42:23.113685 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 17 13:42:23.113871 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
Jan 17 13:42:23.114040 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Jan 17 13:42:23.114210 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
Jan 17 13:42:23.116417 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 17 13:42:23.116588 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Jan 17 13:42:23.116742 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Jan 17 13:42:23.116919 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 17 13:42:23.117071 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
Jan 17 13:42:23.117222 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Jan 17 13:42:23.117390 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
Jan 17 13:42:23.117579 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Jan 17 13:42:23.117744 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
Jan 17 13:42:23.117922 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Jan 17 13:42:23.118107 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
Jan 17 13:42:23.118291 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
Jan 17 13:42:23.119514 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
Jan 17 13:42:23.119687 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jan 17 13:42:23.119886 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
Jan 17 13:42:23.120058 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
Jan 17 13:42:23.120220 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Jan 17 13:42:23.121461 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
Jan 17 13:42:23.121633 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
Jan 17 13:42:23.121808 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Jan 17 13:42:23.121999 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
Jan 17 13:42:23.122162 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
Jan 17 13:42:23.122348 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Jan 17 13:42:23.124582 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
Jan 17 13:42:23.124768 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
Jan 17 13:42:23.124931 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Jan 17 13:42:23.125101 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
Jan 17 13:42:23.125263 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
Jan 17 13:42:23.125441 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Jan 17 13:42:23.125624 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
Jan 17 13:42:23.125801 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
Jan 17 13:42:23.125971 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jan 17 13:42:23.125993 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Jan 17 13:42:23.126007 kernel: PCI: CLS 0 bytes, default 64
Jan 17 13:42:23.126021 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 17 13:42:23.126035 kernel: software IO TLB: mapped [mem 0x0000000073000000-0x0000000077000000] (64MB)
Jan 17 13:42:23.126049 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Jan 17 13:42:23.126063 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x240937b9988, max_idle_ns: 440795218083 ns
Jan 17 13:42:23.126076 kernel: Initialise system trusted keyrings
Jan 17 13:42:23.126097 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Jan 17 13:42:23.126111 kernel: Key type asymmetric registered
Jan 17 13:42:23.126125 kernel: Asymmetric key parser 'x509' registered
Jan 17 13:42:23.126138 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Jan 17 13:42:23.126153 kernel: io scheduler mq-deadline registered
Jan 17 13:42:23.126166 kernel: io scheduler kyber registered
Jan 17 13:42:23.126180 kernel: io scheduler bfq registered
Jan 17 13:42:23.126353 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Jan 17 13:42:23.128722 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Jan 17 13:42:23.128955 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 17 13:42:23.129224 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Jan 17 13:42:23.129449 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Jan 17 13:42:23.129627 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 17 13:42:23.129839 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Jan 17 13:42:23.130013 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Jan 17 13:42:23.130204 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 17 13:42:23.132406 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Jan 17 13:42:23.132587 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Jan 17 13:42:23.132795 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 17 13:42:23.132968 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Jan 17 13:42:23.133136 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Jan 17 13:42:23.133335 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 17 13:42:23.133601 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Jan 17 13:42:23.133788 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Jan 17 13:42:23.133958 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 17 13:42:23.134140 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Jan 17 13:42:23.134311 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Jan 17 13:42:23.134544 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 17 13:42:23.134720 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Jan 17 13:42:23.134917 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Jan 17 13:42:23.135086 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Jan 17 13:42:23.135108 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Jan 17 13:42:23.135123 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Jan 17 13:42:23.135145 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Jan 17 13:42:23.135159 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 17 13:42:23.135178 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 17 13:42:23.135192 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 17 13:42:23.135231 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 17 13:42:23.135244 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 17 13:42:23.135508 kernel: rtc_cmos 00:03: RTC can wake from S4
Jan 17 13:42:23.135531 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Jan 17 13:42:23.135694 kernel: rtc_cmos 00:03: registered as rtc0
Jan 17 13:42:23.135869 kernel: rtc_cmos 00:03: setting system clock to 2025-01-17T13:42:22 UTC (1737121342)
Jan 17 13:42:23.136027 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
Jan 17 13:42:23.136047 kernel: intel_pstate: CPU model not supported
Jan 17 13:42:23.136061 kernel: NET: Registered PF_INET6 protocol family
Jan 17 13:42:23.136075 kernel: Segment Routing with IPv6
Jan 17 13:42:23.136088 kernel: In-situ OAM (IOAM) with IPv6
Jan 17 13:42:23.136102 kernel: NET: Registered PF_PACKET protocol family
Jan 17 13:42:23.136115 kernel: Key type dns_resolver registered
Jan 17 13:42:23.136145 kernel: IPI shorthand broadcast: enabled
Jan 17 13:42:23.136159 kernel: sched_clock: Marking stable (1193036844, 235894439)->(1671390082, -242458799)
Jan 17 13:42:23.136173 kernel: registered taskstats version 1
Jan 17 13:42:23.136187 kernel: Loading compiled-in X.509 certificates
Jan 17 13:42:23.136201 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 6baa290b0089ed5c4c5f7248306af816ac8c7f80'
Jan 17 13:42:23.136214 kernel: Key type .fscrypt registered
Jan 17 13:42:23.136228 kernel: Key type fscrypt-provisioning registered
Jan 17 13:42:23.136241 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 17 13:42:23.136260 kernel: ima: Allocated hash algorithm: sha1
Jan 17 13:42:23.136274 kernel: ima: No architecture policies found
Jan 17 13:42:23.136300 kernel: clk: Disabling unused clocks
Jan 17 13:42:23.136313 kernel: Freeing unused kernel image (initmem) memory: 42848K
Jan 17 13:42:23.136326 kernel: Write protecting the kernel read-only data: 36864k
Jan 17 13:42:23.136340 kernel: Freeing unused kernel image (rodata/data gap) memory: 1848K
Jan 17 13:42:23.136366 kernel: Run /init as init process
Jan 17 13:42:23.136430 kernel: with arguments:
Jan 17 13:42:23.136445 kernel: /init
Jan 17 13:42:23.136458 kernel: with environment:
Jan 17 13:42:23.136479 kernel: HOME=/
Jan 17 13:42:23.136492 kernel: TERM=linux
Jan 17 13:42:23.136506 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Jan 17 13:42:23.136523 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 17 13:42:23.136540 systemd[1]: Detected virtualization kvm.
Jan 17 13:42:23.136555 systemd[1]: Detected architecture x86-64.
Jan 17 13:42:23.136568 systemd[1]: Running in initrd.
Jan 17 13:42:23.136588 systemd[1]: No hostname configured, using default hostname.
Jan 17 13:42:23.136602 systemd[1]: Hostname set to .
Jan 17 13:42:23.136616 systemd[1]: Initializing machine ID from VM UUID.
Jan 17 13:42:23.136631 systemd[1]: Queued start job for default target initrd.target.
Jan 17 13:42:23.136645 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 17 13:42:23.136659 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 17 13:42:23.136674 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jan 17 13:42:23.136689 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 17 13:42:23.136709 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jan 17 13:42:23.136724 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jan 17 13:42:23.136740 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Jan 17 13:42:23.136768 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Jan 17 13:42:23.136783 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 17 13:42:23.136798 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 17 13:42:23.136812 systemd[1]: Reached target paths.target - Path Units.
Jan 17 13:42:23.136833 systemd[1]: Reached target slices.target - Slice Units.
Jan 17 13:42:23.136847 systemd[1]: Reached target swap.target - Swaps.
Jan 17 13:42:23.136861 systemd[1]: Reached target timers.target - Timer Units.
Jan 17 13:42:23.136876 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jan 17 13:42:23.136890 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 17 13:42:23.136905 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 17 13:42:23.136920 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Jan 17 13:42:23.136934 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 17 13:42:23.136948 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 17 13:42:23.136968 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 17 13:42:23.136983 systemd[1]: Reached target sockets.target - Socket Units.
Jan 17 13:42:23.136998 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jan 17 13:42:23.137017 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 17 13:42:23.137039 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jan 17 13:42:23.137053 systemd[1]: Starting systemd-fsck-usr.service...
Jan 17 13:42:23.137067 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 17 13:42:23.137082 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 17 13:42:23.137154 systemd-journald[201]: Collecting audit messages is disabled.
Jan 17 13:42:23.137194 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 17 13:42:23.137209 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jan 17 13:42:23.137224 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 17 13:42:23.137245 systemd[1]: Finished systemd-fsck-usr.service.
Jan 17 13:42:23.137261 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 17 13:42:23.137275 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 17 13:42:23.137289 kernel: Bridge firewalling registered
Jan 17 13:42:23.137312 systemd-journald[201]: Journal started
Jan 17 13:42:23.137345 systemd-journald[201]: Runtime Journal (/run/log/journal/0560f9da3e38428d97c85261f836974d) is 4.7M, max 38.0M, 33.2M free.
Jan 17 13:42:23.090472 systemd-modules-load[202]: Inserted module 'overlay'
Jan 17 13:42:23.131771 systemd-modules-load[202]: Inserted module 'br_netfilter'
Jan 17 13:42:23.175390 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 17 13:42:23.176136 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 17 13:42:23.178275 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 17 13:42:23.180461 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 17 13:42:23.196553 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 17 13:42:23.206665 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 17 13:42:23.210392 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 17 13:42:23.224156 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 17 13:42:23.225446 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 17 13:42:23.242470 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 17 13:42:23.247289 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 17 13:42:23.253549 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 17 13:42:23.255513 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 17 13:42:23.262115 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jan 17 13:42:23.291534 dracut-cmdline[235]: dracut-dracut-053
Jan 17 13:42:23.299929 dracut-cmdline[235]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=bf1e0d81a0170850ab02d370c1a7c7a3f5983c980b3730f748240a3bda2dbb2e
Jan 17 13:42:23.302303 systemd-resolved[234]: Positive Trust Anchors:
Jan 17 13:42:23.302328 systemd-resolved[234]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 17 13:42:23.302426 systemd-resolved[234]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 17 13:42:23.307481 systemd-resolved[234]: Defaulting to hostname 'linux'.
Jan 17 13:42:23.309990 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 17 13:42:23.311755 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 17 13:42:23.406456 kernel: SCSI subsystem initialized
Jan 17 13:42:23.418390 kernel: Loading iSCSI transport class v2.0-870.
Jan 17 13:42:23.432420 kernel: iscsi: registered transport (tcp)
Jan 17 13:42:23.459663 kernel: iscsi: registered transport (qla4xxx)
Jan 17 13:42:23.459750 kernel: QLogic iSCSI HBA Driver
Jan 17 13:42:23.514665 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jan 17 13:42:23.528344 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jan 17 13:42:23.574397 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 17 13:42:23.576830 kernel: device-mapper: uevent: version 1.0.3
Jan 17 13:42:23.576879 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Jan 17 13:42:23.630411 kernel: raid6: sse2x4 gen() 12851 MB/s
Jan 17 13:42:23.645440 kernel: raid6: sse2x2 gen() 8762 MB/s
Jan 17 13:42:23.664200 kernel: raid6: sse2x1 gen() 9659 MB/s
Jan 17 13:42:23.664255 kernel: raid6: using algorithm sse2x4 gen() 12851 MB/s
Jan 17 13:42:23.683112 kernel: raid6: .... xor() 7521 MB/s, rmw enabled
Jan 17 13:42:23.683168 kernel: raid6: using ssse3x2 recovery algorithm
Jan 17 13:42:23.709429 kernel: xor: automatically using best checksumming function avx
Jan 17 13:42:23.907763 kernel: Btrfs loaded, zoned=no, fsverity=no
Jan 17 13:42:23.922559 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jan 17 13:42:23.932631 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 17 13:42:23.950752 systemd-udevd[418]: Using default interface naming scheme 'v255'.
Jan 17 13:42:23.958331 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 17 13:42:23.966582 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jan 17 13:42:23.989334 dracut-pre-trigger[423]: rd.md=0: removing MD RAID activation
Jan 17 13:42:24.027647 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 17 13:42:24.034636 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 17 13:42:24.147666 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 17 13:42:24.157813 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jan 17 13:42:24.187133 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jan 17 13:42:24.191294 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 17 13:42:24.192125 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 17 13:42:24.193514 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 17 13:42:24.203369 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jan 17 13:42:24.229204 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jan 17 13:42:24.291685 kernel: ACPI: bus type USB registered
Jan 17 13:42:24.297425 kernel: usbcore: registered new interface driver usbfs
Jan 17 13:42:24.302402 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues
Jan 17 13:42:24.333960 kernel: cryptd: max_cpu_qlen set to 1000
Jan 17 13:42:24.333989 kernel: usbcore: registered new interface driver hub
Jan 17 13:42:24.334029 kernel: usbcore: registered new device driver usb
Jan 17 13:42:24.334051 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB)
Jan 17 13:42:24.334255 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Jan 17 13:42:24.334276 kernel: GPT:17805311 != 125829119
Jan 17 13:42:24.334294 kernel: GPT:Alternate GPT header not at the end of the disk.
Jan 17 13:42:24.334311 kernel: GPT:17805311 != 125829119
Jan 17 13:42:24.334327 kernel: GPT: Use GNU Parted to correct GPT errors.
Jan 17 13:42:24.334345 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 17 13:42:24.320072 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 17 13:42:24.320292 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 17 13:42:24.332442 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 17 13:42:24.333746 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 17 13:42:24.333955 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 17 13:42:24.335019 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jan 17 13:42:24.349702 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 17 13:42:24.358387 kernel: AVX version of gcm_enc/dec engaged.
Jan 17 13:42:24.358419 kernel: AES CTR mode by8 optimization enabled
Jan 17 13:42:24.387729 kernel: libata version 3.00 loaded.
Jan 17 13:42:24.410733 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Jan 17 13:42:24.514460 kernel: BTRFS: device fsid e459b8ee-f1f7-4c3d-a087-3f1955f52c85 devid 1 transid 36 /dev/vda3 scanned by (udev-worker) (468)
Jan 17 13:42:24.514507 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 scanned by (udev-worker) (464)
Jan 17 13:42:24.514541 kernel: ahci 0000:00:1f.2: version 3.0
Jan 17 13:42:24.514857 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Jan 17 13:42:24.514897 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Jan 17 13:42:24.515104 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Jan 17 13:42:24.515311 kernel: scsi host0: ahci
Jan 17 13:42:24.516349 kernel: scsi host1: ahci
Jan 17 13:42:24.516575 kernel: scsi host2: ahci
Jan 17 13:42:24.516811 kernel: scsi host3: ahci
Jan 17 13:42:24.517024 kernel: scsi host4: ahci
Jan 17 13:42:24.517228 kernel: scsi host5: ahci
Jan 17 13:42:24.517441 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 38
Jan 17 13:42:24.517463 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 38
Jan 17 13:42:24.517481 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 38
Jan 17 13:42:24.517499 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 38
Jan 17 13:42:24.517524 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 38
Jan 17 13:42:24.517543 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 38
Jan 17 13:42:24.516966 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 17 13:42:24.530383 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Jan 17 13:42:24.542365 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Jan 17 13:42:24.543260 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Jan 17 13:42:24.556616 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Jan 17 13:42:24.568560 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jan 17 13:42:24.573548 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jan 17 13:42:24.575784 disk-uuid[556]: Primary Header is updated.
Jan 17 13:42:24.575784 disk-uuid[556]: Secondary Entries is updated.
Jan 17 13:42:24.575784 disk-uuid[556]: Secondary Header is updated.
Jan 17 13:42:24.581435 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 17 13:42:24.590399 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 17 13:42:24.614572 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 17 13:42:24.749398 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Jan 17 13:42:24.749466 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Jan 17 13:42:24.752514 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Jan 17 13:42:24.753401 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Jan 17 13:42:24.756157 kernel: ata1: SATA link down (SStatus 0 SControl 300)
Jan 17 13:42:24.761412 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Jan 17 13:42:24.773345 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
Jan 17 13:42:24.791261 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1
Jan 17 13:42:24.791512 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Jan 17 13:42:24.791736 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
Jan 17 13:42:24.791948 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2
Jan 17 13:42:24.792166 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed
Jan 17 13:42:24.792407 kernel: hub 1-0:1.0: USB hub found
Jan 17 13:42:24.792631 kernel: hub 1-0:1.0: 4 ports detected
Jan 17 13:42:24.792850 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Jan 17 13:42:24.793081 kernel: hub 2-0:1.0: USB hub found
Jan 17 13:42:24.793291 kernel: hub 2-0:1.0: 4 ports detected
Jan 17 13:42:25.025419 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Jan 17 13:42:25.168387 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 17 13:42:25.174189 kernel: usbcore: registered new interface driver usbhid
Jan 17 13:42:25.174225 kernel: usbhid: USB HID core driver
Jan 17 13:42:25.182231 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2
Jan 17 13:42:25.182275 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0
Jan 17 13:42:25.591389 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Jan 17 13:42:25.593657 disk-uuid[557]: The operation has completed successfully.
Jan 17 13:42:25.643549 systemd[1]: disk-uuid.service: Deactivated successfully.
Jan 17 13:42:25.643750 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jan 17 13:42:25.672562 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Jan 17 13:42:25.687660 sh[580]: Success
Jan 17 13:42:25.706555 kernel: device-mapper: verity: sha256 using implementation "sha256-avx"
Jan 17 13:42:25.767299 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Jan 17 13:42:25.779526 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Jan 17 13:42:25.781476 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Jan 17 13:42:25.803417 kernel: BTRFS info (device dm-0): first mount of filesystem e459b8ee-f1f7-4c3d-a087-3f1955f52c85
Jan 17 13:42:25.803498 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Jan 17 13:42:25.806800 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Jan 17 13:42:25.806839 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Jan 17 13:42:25.808500 kernel: BTRFS info (device dm-0): using free space tree
Jan 17 13:42:25.818500 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Jan 17 13:42:25.819962 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Jan 17 13:42:25.825538 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jan 17 13:42:25.831661 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jan 17 13:42:25.853626 kernel: BTRFS info (device vda6): first mount of filesystem a70a40d6-5ab2-4665-81b1-b8e9f58c5ff8
Jan 17 13:42:25.853703 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
Jan 17 13:42:25.853727 kernel: BTRFS info (device vda6): using free space tree
Jan 17 13:42:25.863442 kernel: BTRFS info (device vda6): auto enabling async discard
Jan 17 13:42:25.879235 kernel: BTRFS info (device vda6): last unmount of filesystem a70a40d6-5ab2-4665-81b1-b8e9f58c5ff8
Jan 17 13:42:25.878845 systemd[1]: mnt-oem.mount: Deactivated successfully.
Jan 17 13:42:25.886785 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jan 17 13:42:25.898711 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jan 17 13:42:25.977910 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 17 13:42:25.987641 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 17 13:42:26.032864 systemd-networkd[765]: lo: Link UP
Jan 17 13:42:26.033944 systemd-networkd[765]: lo: Gained carrier
Jan 17 13:42:26.037129 systemd-networkd[765]: Enumeration completed
Jan 17 13:42:26.037302 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 17 13:42:26.039792 systemd-networkd[765]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 17 13:42:26.039798 systemd-networkd[765]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jan 17 13:42:26.040168 systemd[1]: Reached target network.target - Network.
Jan 17 13:42:26.041184 systemd-networkd[765]: eth0: Link UP
Jan 17 13:42:26.047327 ignition[687]: Ignition 2.19.0
Jan 17 13:42:26.041192 systemd-networkd[765]: eth0: Gained carrier
Jan 17 13:42:26.047346 ignition[687]: Stage: fetch-offline
Jan 17 13:42:26.041211 systemd-networkd[765]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jan 17 13:42:26.047450 ignition[687]: no configs at "/usr/lib/ignition/base.d"
Jan 17 13:42:26.050209 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 17 13:42:26.047476 ignition[687]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 17 13:42:26.047661 ignition[687]: parsed url from cmdline: "" Jan 17 13:42:26.047668 ignition[687]: no config URL provided Jan 17 13:42:26.047692 ignition[687]: reading system config file "/usr/lib/ignition/user.ign" Jan 17 13:42:26.056579 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 17 13:42:26.047708 ignition[687]: no config at "/usr/lib/ignition/user.ign" Jan 17 13:42:26.047717 ignition[687]: failed to fetch config: resource requires networking Jan 17 13:42:26.048153 ignition[687]: Ignition finished successfully Jan 17 13:42:26.066476 systemd-networkd[765]: eth0: DHCPv4 address 10.230.9.254/30, gateway 10.230.9.253 acquired from 10.230.9.253 Jan 17 13:42:26.078007 ignition[772]: Ignition 2.19.0 Jan 17 13:42:26.078034 ignition[772]: Stage: fetch Jan 17 13:42:26.078301 ignition[772]: no configs at "/usr/lib/ignition/base.d" Jan 17 13:42:26.078322 ignition[772]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 17 13:42:26.078494 ignition[772]: parsed url from cmdline: "" Jan 17 13:42:26.078502 ignition[772]: no config URL provided Jan 17 13:42:26.078512 ignition[772]: reading system config file "/usr/lib/ignition/user.ign" Jan 17 13:42:26.078528 ignition[772]: no config at "/usr/lib/ignition/user.ign" Jan 17 13:42:26.078797 ignition[772]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Jan 17 13:42:26.078820 ignition[772]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Jan 17 13:42:26.078871 ignition[772]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Jan 17 13:42:26.097163 ignition[772]: GET result: OK Jan 17 13:42:26.098114 ignition[772]: parsing config with SHA512: dd9df7e5f3c9ac89455e82c90d2303ec3d57488c699e43c3424bc1f171f9ff065801882f9473f26c2e03a6ab9d7479c82c993f9ea941fb68bd69d1cdc08cc1c4 Jan 17 13:42:26.103996 unknown[772]: fetched base config from "system" Jan 17 13:42:26.104014 unknown[772]: fetched base config from "system" Jan 17 13:42:26.104571 ignition[772]: fetch: fetch complete Jan 17 13:42:26.104023 unknown[772]: fetched user config from "openstack" Jan 17 13:42:26.104580 ignition[772]: fetch: fetch passed Jan 17 13:42:26.108451 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 17 13:42:26.104647 ignition[772]: Ignition finished successfully Jan 17 13:42:26.115562 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 17 13:42:26.143279 ignition[779]: Ignition 2.19.0 Jan 17 13:42:26.143301 ignition[779]: Stage: kargs Jan 17 13:42:26.143571 ignition[779]: no configs at "/usr/lib/ignition/base.d" Jan 17 13:42:26.143592 ignition[779]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 17 13:42:26.146603 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 17 13:42:26.145240 ignition[779]: kargs: kargs passed Jan 17 13:42:26.145312 ignition[779]: Ignition finished successfully Jan 17 13:42:26.154624 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 17 13:42:26.172059 ignition[785]: Ignition 2.19.0 Jan 17 13:42:26.172071 ignition[785]: Stage: disks Jan 17 13:42:26.172309 ignition[785]: no configs at "/usr/lib/ignition/base.d" Jan 17 13:42:26.174804 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
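The fetch stage above retrieves user data from OpenStack's link-local metadata service and then logs the SHA512 of the config it parsed. A rough Python equivalent of that fetch-and-hash step (the URL is taken from the log; the fixed timeout and single attempt are simplifications, not Ignition's actual retry behavior):

```python
# Illustrative re-creation of Ignition's fetch stage: GET the user data
# from the metadata service, then hash it the way the log reports.
import hashlib
import urllib.request

URL = "http://169.254.169.254/openstack/latest/user_data"

with urllib.request.urlopen(URL, timeout=10) as resp:  # attempt #1
    config = resp.read()

print("parsing config with SHA512:", hashlib.sha512(config).hexdigest())
```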
Jan 17 13:42:26.172329 ignition[785]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 17 13:42:26.173422 ignition[785]: disks: disks passed Jan 17 13:42:26.176811 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 17 13:42:26.173496 ignition[785]: Ignition finished successfully Jan 17 13:42:26.178423 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 17 13:42:26.179801 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 17 13:42:26.181318 systemd[1]: Reached target sysinit.target - System Initialization. Jan 17 13:42:26.182651 systemd[1]: Reached target basic.target - Basic System. Jan 17 13:42:26.190547 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 17 13:42:26.208738 systemd-fsck[793]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 17 13:42:26.212150 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 17 13:42:26.217471 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 17 13:42:26.345414 kernel: EXT4-fs (vda9): mounted filesystem 0ba4fe0e-76d7-406f-b570-4642d86198f6 r/w with ordered data mode. Quota mode: none. Jan 17 13:42:26.346500 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 17 13:42:26.347808 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 17 13:42:26.356551 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 17 13:42:26.359867 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 17 13:42:26.361895 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 17 13:42:26.363583 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Jan 17 13:42:26.369483 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/vda6 scanned by mount (801) Jan 17 13:42:26.369528 kernel: BTRFS info (device vda6): first mount of filesystem a70a40d6-5ab2-4665-81b1-b8e9f58c5ff8 Jan 17 13:42:26.377045 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 17 13:42:26.377088 kernel: BTRFS info (device vda6): using free space tree Jan 17 13:42:26.377584 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 17 13:42:26.377638 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 17 13:42:26.382013 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 17 13:42:26.385473 kernel: BTRFS info (device vda6): auto enabling async discard Jan 17 13:42:26.391585 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 17 13:42:26.399688 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 17 13:42:26.476737 initrd-setup-root[829]: cut: /sysroot/etc/passwd: No such file or directory Jan 17 13:42:26.485225 initrd-setup-root[836]: cut: /sysroot/etc/group: No such file or directory Jan 17 13:42:26.493858 initrd-setup-root[843]: cut: /sysroot/etc/shadow: No such file or directory Jan 17 13:42:26.505048 initrd-setup-root[850]: cut: /sysroot/etc/gshadow: No such file or directory Jan 17 13:42:26.611470 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 17 13:42:26.618467 systemd[1]: Starting ignition-mount.service - Ignition (mount)... 
Jan 17 13:42:26.620601 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 17 13:42:26.633450 kernel: BTRFS info (device vda6): last unmount of filesystem a70a40d6-5ab2-4665-81b1-b8e9f58c5ff8 Jan 17 13:42:26.668233 ignition[917]: INFO : Ignition 2.19.0 Jan 17 13:42:26.669415 ignition[917]: INFO : Stage: mount Jan 17 13:42:26.669415 ignition[917]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 17 13:42:26.669415 ignition[917]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 17 13:42:26.671927 ignition[917]: INFO : mount: mount passed Jan 17 13:42:26.671927 ignition[917]: INFO : Ignition finished successfully Jan 17 13:42:26.672344 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 17 13:42:26.673966 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 17 13:42:26.801149 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 17 13:42:27.381737 systemd-networkd[765]: eth0: Gained IPv6LL Jan 17 13:42:28.887773 systemd-networkd[765]: eth0: Ignoring DHCPv6 address 2a02:1348:179:827f:24:19ff:fee6:9fe/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:827f:24:19ff:fee6:9fe/64 assigned by NDisc. Jan 17 13:42:28.887791 systemd-networkd[765]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Jan 17 13:42:33.539178 coreos-metadata[803]: Jan 17 13:42:33.539 WARN failed to locate config-drive, using the metadata service API instead Jan 17 13:42:33.564246 coreos-metadata[803]: Jan 17 13:42:33.564 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 17 13:42:33.585525 coreos-metadata[803]: Jan 17 13:42:33.585 INFO Fetch successful Jan 17 13:42:33.586437 coreos-metadata[803]: Jan 17 13:42:33.586 INFO wrote hostname srv-so9hk.gb1.brightbox.com to /sysroot/etc/hostname Jan 17 13:42:33.588599 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Jan 17 13:42:33.588765 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Jan 17 13:42:33.595538 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 17 13:42:33.611607 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 17 13:42:33.625429 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by mount (934) Jan 17 13:42:33.628879 kernel: BTRFS info (device vda6): first mount of filesystem a70a40d6-5ab2-4665-81b1-b8e9f58c5ff8 Jan 17 13:42:33.628919 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jan 17 13:42:33.630818 kernel: BTRFS info (device vda6): using free space tree Jan 17 13:42:33.636423 kernel: BTRFS info (device vda6): auto enabling async discard Jan 17 13:42:33.639304 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jan 17 13:42:33.676301 ignition[952]: INFO : Ignition 2.19.0 Jan 17 13:42:33.676301 ignition[952]: INFO : Stage: files Jan 17 13:42:33.678087 ignition[952]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 17 13:42:33.678087 ignition[952]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 17 13:42:33.678087 ignition[952]: DEBUG : files: compiled without relabeling support, skipping Jan 17 13:42:33.680881 ignition[952]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 17 13:42:33.680881 ignition[952]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 17 13:42:33.682982 ignition[952]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 17 13:42:33.683996 ignition[952]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 17 13:42:33.683996 ignition[952]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 17 13:42:33.683681 unknown[952]: wrote ssh authorized keys file for user: core Jan 17 13:42:33.686899 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 17 13:42:33.686899 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Jan 17 13:42:33.917684 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 17 13:42:34.279657 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 17 13:42:34.281443 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 17 13:42:34.281443 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 17 13:42:34.281443 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 17 13:42:34.281443 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 17 13:42:34.281443 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 17 13:42:34.281443 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 17 13:42:34.281443 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 17 13:42:34.281443 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 17 13:42:34.281443 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 17 13:42:34.281443 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 17 13:42:34.299203 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 17 13:42:34.299203 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 17 13:42:34.299203 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 17 13:42:34.299203 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Jan 17 13:42:34.929603 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 17 13:42:36.934705 ignition[952]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Jan 17 13:42:36.934705 ignition[952]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 17 13:42:36.937555 ignition[952]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 17 13:42:36.937555 ignition[952]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 17 13:42:36.937555 ignition[952]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 17 13:42:36.937555 ignition[952]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Jan 17 13:42:36.937555 ignition[952]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Jan 17 13:42:36.947539 ignition[952]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 17 13:42:36.947539 ignition[952]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 17 13:42:36.947539 ignition[952]: INFO : files: files passed Jan 17 13:42:36.947539 ignition[952]: INFO : Ignition finished successfully Jan 17 13:42:36.940475 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 17 13:42:36.953623 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 17 13:42:36.965686 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 17 13:42:36.971595 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 17 13:42:36.971819 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 17 13:42:36.982560 initrd-setup-root-after-ignition[980]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 17 13:42:36.982560 initrd-setup-root-after-ignition[980]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 17 13:42:36.988623 initrd-setup-root-after-ignition[984]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 17 13:42:36.990466 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 17 13:42:36.992536 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 17 13:42:37.003660 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 17 13:42:37.037321 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 17 13:42:37.037542 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 17 13:42:37.039854 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Jan 17 13:42:37.041122 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 17 13:42:37.042924 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 17 13:42:37.047646 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 17 13:42:37.069784 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 17 13:42:37.076591 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 17 13:42:37.092655 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 17 13:42:37.093583 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 17 13:42:37.096572 systemd[1]: Stopped target timers.target - Timer Units. Jan 17 13:42:37.097419 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 17 13:42:37.097595 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 17 13:42:37.100009 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 17 13:42:37.101013 systemd[1]: Stopped target basic.target - Basic System. Jan 17 13:42:37.102701 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 17 13:42:37.104171 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 17 13:42:37.106616 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 17 13:42:37.108262 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 17 13:42:37.110027 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 17 13:42:37.111879 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 17 13:42:37.114601 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 17 13:42:37.115583 systemd[1]: Stopped target swap.target - Swaps. Jan 17 13:42:37.117059 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 17 13:42:37.117236 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 17 13:42:37.119293 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 17 13:42:37.120424 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 17 13:42:37.122049 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 17 13:42:37.122234 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 17 13:42:37.123752 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 17 13:42:37.123940 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 17 13:42:37.125924 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 17 13:42:37.126215 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 17 13:42:37.128293 systemd[1]: ignition-files.service: Deactivated successfully. Jan 17 13:42:37.128528 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 17 13:42:37.135682 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 17 13:42:37.138702 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 17 13:42:37.139632 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 17 13:42:37.139912 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 17 13:42:37.143552 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. 
Jan 17 13:42:37.143744 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 17 13:42:37.154007 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 17 13:42:37.154233 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 17 13:42:37.172278 ignition[1004]: INFO : Ignition 2.19.0 Jan 17 13:42:37.180976 ignition[1004]: INFO : Stage: umount Jan 17 13:42:37.180976 ignition[1004]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 17 13:42:37.180976 ignition[1004]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Jan 17 13:42:37.180976 ignition[1004]: INFO : umount: umount passed Jan 17 13:42:37.180976 ignition[1004]: INFO : Ignition finished successfully Jan 17 13:42:37.179297 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 17 13:42:37.183015 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 17 13:42:37.183161 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 17 13:42:37.187418 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 17 13:42:37.187542 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 17 13:42:37.188313 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 17 13:42:37.190691 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 17 13:42:37.191837 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 17 13:42:37.191915 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 17 13:42:37.193566 systemd[1]: Stopped target network.target - Network. Jan 17 13:42:37.194235 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 17 13:42:37.194312 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 17 13:42:37.195123 systemd[1]: Stopped target paths.target - Path Units. Jan 17 13:42:37.198658 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 17 13:42:37.202535 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 17 13:42:37.203275 systemd[1]: Stopped target slices.target - Slice Units. Jan 17 13:42:37.203936 systemd[1]: Stopped target sockets.target - Socket Units. Jan 17 13:42:37.206603 systemd[1]: iscsid.socket: Deactivated successfully. Jan 17 13:42:37.206679 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 17 13:42:37.207503 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 17 13:42:37.207567 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 17 13:42:37.208289 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 17 13:42:37.210455 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 17 13:42:37.211440 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 17 13:42:37.211508 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 17 13:42:37.214038 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 17 13:42:37.216471 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 17 13:42:37.218579 systemd-networkd[765]: eth0: DHCPv6 lease lost Jan 17 13:42:37.221737 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 17 13:42:37.221937 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 17 13:42:37.226925 systemd[1]: systemd-resolved.service: Deactivated successfully. 
Jan 17 13:42:37.227113 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 17 13:42:37.230973 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 17 13:42:37.231245 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 17 13:42:37.239620 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 17 13:42:37.240474 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 17 13:42:37.240555 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 17 13:42:37.242029 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 17 13:42:37.242108 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 17 13:42:37.245158 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 17 13:42:37.245230 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 17 13:42:37.247737 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 17 13:42:37.247805 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 17 13:42:37.249542 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 17 13:42:37.262088 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 17 13:42:37.262799 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 17 13:42:37.264725 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 17 13:42:37.264895 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 17 13:42:37.267019 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 17 13:42:37.267130 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 17 13:42:37.268579 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 17 13:42:37.268638 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 17 13:42:37.270177 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 17 13:42:37.270248 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 17 13:42:37.271897 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 17 13:42:37.271964 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 17 13:42:37.274659 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 17 13:42:37.274755 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 17 13:42:37.281628 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 17 13:42:37.282418 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 17 13:42:37.282508 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 17 13:42:37.286494 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jan 17 13:42:37.286572 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 17 13:42:37.288296 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 17 13:42:37.288363 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 17 13:42:37.292504 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 17 13:42:37.292575 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 17 13:42:37.295797 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 17 13:42:37.295953 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 17 13:42:37.321532 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 17 13:42:37.321788 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 17 13:42:37.323973 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 17 13:42:37.324836 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 17 13:42:37.324931 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 17 13:42:37.330672 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 17 13:42:37.343892 systemd[1]: Switching root. Jan 17 13:42:37.386052 systemd-journald[201]: Journal stopped Jan 17 13:42:38.876894 systemd-journald[201]: Received SIGTERM from PID 1 (systemd). Jan 17 13:42:38.877109 kernel: SELinux: policy capability network_peer_controls=1 Jan 17 13:42:38.877161 kernel: SELinux: policy capability open_perms=1 Jan 17 13:42:38.877188 kernel: SELinux: policy capability extended_socket_class=1 Jan 17 13:42:38.877214 kernel: SELinux: policy capability always_check_network=0 Jan 17 13:42:38.877239 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 17 13:42:38.877276 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 17 13:42:38.877317 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 17 13:42:38.877366 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 17 13:42:38.880057 kernel: audit: type=1403 audit(1737121357.616:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jan 17 13:42:38.880180 systemd[1]: Successfully loaded SELinux policy in 49.998ms. Jan 17 13:42:38.880285 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 23.318ms. Jan 17 13:42:38.880316 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 17 13:42:38.880354 systemd[1]: Detected virtualization kvm. Jan 17 13:42:38.881723 systemd[1]: Detected architecture x86-64. Jan 17 13:42:38.881781 systemd[1]: Detected first boot. Jan 17 13:42:38.881823 systemd[1]: Hostname set to <srv-so9hk.gb1.brightbox.com>. Jan 17 13:42:38.881849 systemd[1]: Initializing machine ID from VM UUID. Jan 17 13:42:38.881888 zram_generator::config[1046]: No configuration found. Jan 17 13:42:38.881929 systemd[1]: Populated /etc with preset unit settings. Jan 17 13:42:38.881964 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 17 13:42:38.882000 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 17 13:42:38.882027 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 17 13:42:38.882066 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 17 13:42:38.882088 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 17 13:42:38.882120 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 17 13:42:38.882141 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 17 13:42:38.882174 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jan 17 13:42:38.882202 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 17 13:42:38.882236 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 17 13:42:38.882264 systemd[1]: Created slice user.slice - User and Session Slice. Jan 17 13:42:38.882306 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 17 13:42:38.882351 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 17 13:42:38.882390 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 17 13:42:38.882419 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 17 13:42:38.882451 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 17 13:42:38.882488 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 17 13:42:38.882510 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jan 17 13:42:38.882530 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 17 13:42:38.882551 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 17 13:42:38.882583 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 17 13:42:38.882623 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 17 13:42:38.882645 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 17 13:42:38.882680 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 17 13:42:38.882710 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 17 13:42:38.882745 systemd[1]: Reached target slices.target - Slice Units. Jan 17 13:42:38.882772 systemd[1]: Reached target swap.target - Swaps. Jan 17 13:42:38.882805 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 17 13:42:38.882827 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 17 13:42:38.882854 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 17 13:42:38.882887 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 17 13:42:38.882929 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 17 13:42:38.882951 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 17 13:42:38.882983 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 17 13:42:38.883006 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 17 13:42:38.883026 systemd[1]: Mounting media.mount - External Media Directory... Jan 17 13:42:38.883062 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 17 13:42:38.883090 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 17 13:42:38.883111 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 17 13:42:38.883131 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 17 13:42:38.883151 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). 
Jan 17 13:42:38.883195 systemd[1]: Reached target machines.target - Containers. Jan 17 13:42:38.883224 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 17 13:42:38.883245 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 17 13:42:38.883271 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 17 13:42:38.883291 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 17 13:42:38.883318 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 17 13:42:38.883355 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 17 13:42:38.890103 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 17 13:42:38.890148 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 17 13:42:38.890172 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 17 13:42:38.890193 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 17 13:42:38.890220 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 17 13:42:38.890242 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 17 13:42:38.890262 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 17 13:42:38.890294 systemd[1]: Stopped systemd-fsck-usr.service. Jan 17 13:42:38.890332 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 17 13:42:38.890356 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 17 13:42:38.890407 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 17 13:42:38.890430 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 17 13:42:38.890451 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 17 13:42:38.890480 systemd[1]: verity-setup.service: Deactivated successfully. Jan 17 13:42:38.890508 systemd[1]: Stopped verity-setup.service. Jan 17 13:42:38.890539 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 17 13:42:38.890560 kernel: fuse: init (API version 7.39) Jan 17 13:42:38.890586 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 17 13:42:38.890613 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 17 13:42:38.890647 systemd[1]: Mounted media.mount - External Media Directory. Jan 17 13:42:38.890669 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 17 13:42:38.890690 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 17 13:42:38.890749 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 17 13:42:38.890789 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 17 13:42:38.890812 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 17 13:42:38.890884 systemd-journald[1139]: Collecting audit messages is disabled. Jan 17 13:42:38.890954 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 17 13:42:38.890984 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Jan 17 13:42:38.891017 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 17 13:42:38.891055 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 17 13:42:38.891078 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 17 13:42:38.891098 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 17 13:42:38.891118 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 17 13:42:38.891138 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 17 13:42:38.891159 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 17 13:42:38.891178 kernel: loop: module loaded Jan 17 13:42:38.891206 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 17 13:42:38.891247 systemd-journald[1139]: Journal started Jan 17 13:42:38.891291 systemd-journald[1139]: Runtime Journal (/run/log/journal/0560f9da3e38428d97c85261f836974d) is 4.7M, max 38.0M, 33.2M free. Jan 17 13:42:38.896138 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 17 13:42:38.431507 systemd[1]: Queued start job for default target multi-user.target. Jan 17 13:42:38.898499 systemd[1]: Started systemd-journald.service - Journal Service. Jan 17 13:42:38.453610 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jan 17 13:42:38.454255 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 17 13:42:38.905826 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 17 13:42:38.906100 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 17 13:42:38.922307 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 17 13:42:38.931448 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 17 13:42:38.932388 kernel: ACPI: bus type drm_connector registered Jan 17 13:42:38.942943 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 17 13:42:38.944827 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 17 13:42:38.944889 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 17 13:42:38.949072 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink). Jan 17 13:42:38.959444 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 17 13:42:38.962533 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 17 13:42:38.965186 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 17 13:42:38.976582 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 17 13:42:38.979479 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 17 13:42:38.982526 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 17 13:42:38.985684 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 17 13:42:38.986565 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 17 13:42:38.988227 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Jan 17 13:42:38.994576 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 17 13:42:39.000557 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 17 13:42:39.003924 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 17 13:42:39.005482 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 17 13:42:39.007692 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 17 13:42:39.008843 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 17 13:42:39.011874 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 17 13:42:39.047478 systemd-journald[1139]: Time spent on flushing to /var/log/journal/0560f9da3e38428d97c85261f836974d is 70.793ms for 1140 entries. Jan 17 13:42:39.047478 systemd-journald[1139]: System Journal (/var/log/journal/0560f9da3e38428d97c85261f836974d) is 8.0M, max 584.8M, 576.8M free. Jan 17 13:42:39.164146 systemd-journald[1139]: Received client request to flush runtime journal. Jan 17 13:42:39.164234 kernel: loop0: detected capacity change from 0 to 210664 Jan 17 13:42:39.164277 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 17 13:42:39.089752 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 17 13:42:39.090835 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 17 13:42:39.104664 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk... Jan 17 13:42:39.175480 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 17 13:42:39.177512 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 17 13:42:39.194338 systemd-tmpfiles[1179]: ACLs are not supported, ignoring. Jan 17 13:42:39.195450 systemd-tmpfiles[1179]: ACLs are not supported, ignoring. Jan 17 13:42:39.196730 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 17 13:42:39.198911 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk. Jan 17 13:42:39.203604 kernel: loop1: detected capacity change from 0 to 8 Jan 17 13:42:39.222620 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 17 13:42:39.241789 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 17 13:42:39.249757 kernel: loop2: detected capacity change from 0 to 140768 Jan 17 13:42:39.290772 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 17 13:42:39.301980 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Jan 17 13:42:39.337602 kernel: loop3: detected capacity change from 0 to 142488 Jan 17 13:42:39.338705 udevadm[1201]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Jan 17 13:42:39.346209 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 17 13:42:39.358540 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 17 13:42:39.397405 kernel: loop4: detected capacity change from 0 to 210664 Jan 17 13:42:39.396993 systemd-tmpfiles[1204]: ACLs are not supported, ignoring. Jan 17 13:42:39.397023 systemd-tmpfiles[1204]: ACLs are not supported, ignoring. 
Jan 17 13:42:39.406827 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 17 13:42:39.422393 kernel: loop5: detected capacity change from 0 to 8 Jan 17 13:42:39.427386 kernel: loop6: detected capacity change from 0 to 140768 Jan 17 13:42:39.472452 kernel: loop7: detected capacity change from 0 to 142488 Jan 17 13:42:39.514390 (sd-merge)[1208]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. Jan 17 13:42:39.516619 (sd-merge)[1208]: Merged extensions into '/usr'. Jan 17 13:42:39.528337 systemd[1]: Reloading requested from client PID 1178 ('systemd-sysext') (unit systemd-sysext.service)... Jan 17 13:42:39.528395 systemd[1]: Reloading... Jan 17 13:42:39.738581 zram_generator::config[1235]: No configuration found. Jan 17 13:42:39.787460 ldconfig[1173]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 17 13:42:39.919578 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 17 13:42:39.988125 systemd[1]: Reloading finished in 457 ms. Jan 17 13:42:40.014967 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 17 13:42:40.021279 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 17 13:42:40.034636 systemd[1]: Starting ensure-sysext.service... Jan 17 13:42:40.041820 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 17 13:42:40.050701 systemd[1]: Reloading requested from client PID 1291 ('systemctl') (unit ensure-sysext.service)... Jan 17 13:42:40.050731 systemd[1]: Reloading... Jan 17 13:42:40.105429 systemd-tmpfiles[1292]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 17 13:42:40.112037 systemd-tmpfiles[1292]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jan 17 13:42:40.116236 systemd-tmpfiles[1292]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jan 17 13:42:40.120911 systemd-tmpfiles[1292]: ACLs are not supported, ignoring. Jan 17 13:42:40.123535 systemd-tmpfiles[1292]: ACLs are not supported, ignoring. Jan 17 13:42:40.133395 zram_generator::config[1315]: No configuration found. Jan 17 13:42:40.144680 systemd-tmpfiles[1292]: Detected autofs mount point /boot during canonicalization of boot. Jan 17 13:42:40.147409 systemd-tmpfiles[1292]: Skipping /boot Jan 17 13:42:40.190907 systemd-tmpfiles[1292]: Detected autofs mount point /boot during canonicalization of boot. Jan 17 13:42:40.190939 systemd-tmpfiles[1292]: Skipping /boot Jan 17 13:42:40.363227 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 17 13:42:40.431799 systemd[1]: Reloading finished in 380 ms. Jan 17 13:42:40.458709 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 17 13:42:40.465072 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 17 13:42:40.478616 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Jan 17 13:42:40.482753 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... 
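The (sd-merge) entries above show systemd-sysext discovering four extension images and overlaying them onto /usr. A quick sketch for inspecting the images it can see, assuming the /etc/extensions directory the Ignition files stage populated earlier (other search paths such as /run/extensions and /var/lib/extensions are omitted; on this host the Flatcar and OEM extensions come from built-in locations):

```python
# List extension images under /etc/extensions; systemd-sysext merges
# such images into /usr, as the sd-merge messages above record.
from pathlib import Path

for image in sorted(Path("/etc/extensions").glob("*.raw")):
    print(image.name, "->", image.resolve())  # resolve() follows symlinks
```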
Jan 17 13:42:40.488349 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 17 13:42:40.494577 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 17 13:42:40.499531 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 17 13:42:40.507576 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 17 13:42:40.517649 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 17 13:42:40.517951 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 17 13:42:40.531239 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 17 13:42:40.541287 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 17 13:42:40.553699 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 17 13:42:40.555416 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 17 13:42:40.555600 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 17 13:42:40.561102 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 17 13:42:40.562516 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 17 13:42:40.562805 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 17 13:42:40.562963 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 17 13:42:40.568203 systemd-udevd[1381]: Using default interface naming scheme 'v255'. Jan 17 13:42:40.569507 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 17 13:42:40.569829 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 17 13:42:40.581850 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 17 13:42:40.582917 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 17 13:42:40.583138 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jan 17 13:42:40.584455 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 17 13:42:40.585537 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 17 13:42:40.589778 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 17 13:42:40.592516 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 17 13:42:40.600971 systemd[1]: Finished ensure-sysext.service. Jan 17 13:42:40.610950 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 17 13:42:40.612290 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Jan 17 13:42:40.621579 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 17 13:42:40.629534 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 17 13:42:40.639829 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 17 13:42:40.642795 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 17 13:42:40.645955 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 17 13:42:40.646207 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 17 13:42:40.650940 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 17 13:42:40.661637 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 17 13:42:40.663843 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 17 13:42:40.675467 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 17 13:42:40.685634 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 17 13:42:40.711025 augenrules[1418]: No rules Jan 17 13:42:40.718651 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Jan 17 13:42:40.726926 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 17 13:42:40.730972 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 17 13:42:40.733071 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 17 13:42:40.754206 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 17 13:42:40.960286 systemd-resolved[1380]: Positive Trust Anchors: Jan 17 13:42:40.960331 systemd-resolved[1380]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 17 13:42:40.960395 systemd-resolved[1380]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 17 13:42:40.969197 systemd-resolved[1380]: Using system hostname 'srv-so9hk.gb1.brightbox.com'. Jan 17 13:42:40.971770 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 17 13:42:40.972770 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 17 13:42:40.977316 systemd-networkd[1410]: lo: Link UP Jan 17 13:42:40.977334 systemd-networkd[1410]: lo: Gained carrier Jan 17 13:42:40.982990 systemd-networkd[1410]: Enumeration completed Jan 17 13:42:40.984159 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 17 13:42:40.984841 systemd-networkd[1410]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 17 13:42:40.985051 systemd[1]: Reached target network.target - Network. 
Jan 17 13:42:40.985414 systemd-networkd[1410]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 17 13:42:40.987708 systemd-networkd[1410]: eth0: Link UP Jan 17 13:42:40.989422 systemd-networkd[1410]: eth0: Gained carrier Jan 17 13:42:40.989535 systemd-networkd[1410]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 17 13:42:40.994619 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 17 13:42:40.999328 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 17 13:42:41.000648 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jan 17 13:42:41.000717 systemd[1]: Reached target time-set.target - System Time Set. Jan 17 13:42:41.009475 systemd-networkd[1410]: eth0: DHCPv4 address 10.230.9.254/30, gateway 10.230.9.253 acquired from 10.230.9.253 Jan 17 13:42:41.010974 systemd-timesyncd[1401]: Network configuration changed, trying to establish connection. Jan 17 13:42:41.031724 systemd-networkd[1410]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jan 17 13:42:41.045413 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1438) Jan 17 13:42:41.078455 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Jan 17 13:42:41.090162 kernel: mousedev: PS/2 mouse device common for all mice Jan 17 13:42:41.099415 kernel: ACPI: button: Power Button [PWRF] Jan 17 13:42:41.163400 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4 Jan 17 13:42:41.184396 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jan 17 13:42:41.195648 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Jan 17 13:42:41.195955 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jan 17 13:42:41.212680 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 17 13:42:41.225050 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jan 17 13:42:41.240805 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 17 13:42:41.260790 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 17 13:42:41.394742 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 17 13:42:41.459825 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Jan 17 13:42:41.465650 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Jan 17 13:42:41.486492 lvm[1463]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 17 13:42:41.518870 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Jan 17 13:42:41.520721 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 17 13:42:41.521571 systemd[1]: Reached target sysinit.target - System Initialization. Jan 17 13:42:41.522465 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 17 13:42:41.523309 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. 
Jan 17 13:42:41.524553 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 17 13:42:41.525458 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 17 13:42:41.526284 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 17 13:42:41.527115 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 17 13:42:41.527169 systemd[1]: Reached target paths.target - Path Units. Jan 17 13:42:41.527884 systemd[1]: Reached target timers.target - Timer Units. Jan 17 13:42:41.529995 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 17 13:42:41.532608 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 17 13:42:41.538579 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 17 13:42:41.541180 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Jan 17 13:42:41.542692 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 17 13:42:41.543575 systemd[1]: Reached target sockets.target - Socket Units. Jan 17 13:42:41.544245 systemd[1]: Reached target basic.target - Basic System. Jan 17 13:42:41.545013 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 17 13:42:41.545072 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 17 13:42:41.551539 systemd[1]: Starting containerd.service - containerd container runtime... Jan 17 13:42:41.555680 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 17 13:42:41.561237 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 17 13:42:41.563414 lvm[1467]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Jan 17 13:42:41.570476 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 17 13:42:41.576590 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 17 13:42:41.577516 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 17 13:42:41.584564 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 17 13:42:41.600001 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 17 13:42:41.606567 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 17 13:42:41.614587 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 17 13:42:41.631591 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 17 13:42:41.633307 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 17 13:42:41.634053 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 17 13:42:41.638699 systemd[1]: Starting update-engine.service - Update Engine... 
Jan 17 13:42:41.640351 jq[1471]: false Jan 17 13:42:41.641506 dbus-daemon[1470]: [system] SELinux support is enabled Jan 17 13:42:41.645636 dbus-daemon[1470]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1410 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Jan 17 13:42:41.650453 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 17 13:42:41.653547 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 17 13:42:41.662057 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Jan 17 13:42:41.674058 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 17 13:42:41.675688 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 17 13:42:41.681382 extend-filesystems[1472]: Found loop4 Jan 17 13:42:41.681382 extend-filesystems[1472]: Found loop5 Jan 17 13:42:41.681382 extend-filesystems[1472]: Found loop6 Jan 17 13:42:41.681382 extend-filesystems[1472]: Found loop7 Jan 17 13:42:41.681382 extend-filesystems[1472]: Found vda Jan 17 13:42:41.681382 extend-filesystems[1472]: Found vda1 Jan 17 13:42:41.681382 extend-filesystems[1472]: Found vda2 Jan 17 13:42:41.681382 extend-filesystems[1472]: Found vda3 Jan 17 13:42:41.681382 extend-filesystems[1472]: Found usr Jan 17 13:42:41.681382 extend-filesystems[1472]: Found vda4 Jan 17 13:42:41.681382 extend-filesystems[1472]: Found vda6 Jan 17 13:42:41.681382 extend-filesystems[1472]: Found vda7 Jan 17 13:42:41.681382 extend-filesystems[1472]: Found vda9 Jan 17 13:42:41.681382 extend-filesystems[1472]: Checking size of /dev/vda9 Jan 17 13:42:41.695803 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 17 13:42:41.699392 dbus-daemon[1470]: [system] Successfully activated service 'org.freedesktop.systemd1' Jan 17 13:42:41.695856 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 17 13:42:41.705626 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 17 13:42:41.705672 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 17 13:42:41.717431 (ntainerd)[1493]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jan 17 13:42:41.728588 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Jan 17 13:42:41.730096 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 17 13:42:41.731475 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 17 13:42:41.733292 jq[1484]: true Jan 17 13:42:41.748153 systemd[1]: motdgen.service: Deactivated successfully. Jan 17 13:42:41.749539 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 17 13:42:41.763682 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
Jan 17 13:42:41.771701 tar[1492]: linux-amd64/helm Jan 17 13:42:41.776203 extend-filesystems[1472]: Resized partition /dev/vda9 Jan 17 13:42:41.784410 extend-filesystems[1511]: resize2fs 1.47.1 (20-May-2024) Jan 17 13:42:41.814392 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 15121403 blocks Jan 17 13:42:41.821407 update_engine[1482]: I20250117 13:42:41.819733 1482 main.cc:92] Flatcar Update Engine starting Jan 17 13:42:41.824571 jq[1505]: true Jan 17 13:42:41.830076 update_engine[1482]: I20250117 13:42:41.825592 1482 update_check_scheduler.cc:74] Next update check in 5m6s Jan 17 13:42:41.826510 systemd[1]: Started update-engine.service - Update Engine. Jan 17 13:42:41.835829 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 17 13:42:41.849427 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (1419) Jan 17 13:42:41.996512 locksmithd[1512]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 17 13:42:42.104312 systemd-logind[1480]: Watching system buttons on /dev/input/event2 (Power Button) Jan 17 13:42:42.104349 systemd-logind[1480]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jan 17 13:42:42.107795 systemd-logind[1480]: New seat seat0. Jan 17 13:42:42.111917 systemd[1]: Started systemd-logind.service - User Login Management. Jan 17 13:42:42.121520 bash[1537]: Updated "/home/core/.ssh/authorized_keys" Jan 17 13:42:42.134453 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 17 13:42:42.915460 systemd-timesyncd[1401]: Contacted time server 178.62.250.107:123 (0.flatcar.pool.ntp.org). Jan 17 13:42:42.915551 systemd-timesyncd[1401]: Initial clock synchronization to Fri 2025-01-17 13:42:42.915247 UTC. Jan 17 13:42:42.917873 dbus-daemon[1470]: [system] Successfully activated service 'org.freedesktop.hostname1' Jan 17 13:42:42.915623 systemd-resolved[1380]: Clock change detected. Flushing caches. Jan 17 13:42:42.918487 dbus-daemon[1470]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.8' (uid=0 pid=1503 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Jan 17 13:42:42.918553 systemd[1]: Starting sshkeys.service... Jan 17 13:42:42.925146 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Jan 17 13:42:42.939512 systemd[1]: Starting polkit.service - Authorization Manager... Jan 17 13:42:42.949977 kernel: EXT4-fs (vda9): resized filesystem to 15121403 Jan 17 13:42:42.967343 extend-filesystems[1511]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jan 17 13:42:42.967343 extend-filesystems[1511]: old_desc_blocks = 1, new_desc_blocks = 8 Jan 17 13:42:42.967343 extend-filesystems[1511]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long. Jan 17 13:42:42.975244 extend-filesystems[1472]: Resized filesystem in /dev/vda9 Jan 17 13:42:42.970611 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 17 13:42:42.971034 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 17 13:42:42.990529 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 17 13:42:42.997974 polkitd[1540]: Started polkitd version 121 Jan 17 13:42:42.998582 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jan 17 13:42:43.014105 polkitd[1540]: Loading rules from directory /etc/polkit-1/rules.d Jan 17 13:42:43.014596 polkitd[1540]: Loading rules from directory /usr/share/polkit-1/rules.d Jan 17 13:42:43.021222 polkitd[1540]: Finished loading, compiling and executing 2 rules Jan 17 13:42:43.024124 dbus-daemon[1470]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Jan 17 13:42:43.024864 systemd[1]: Started polkit.service - Authorization Manager. Jan 17 13:42:43.027245 polkitd[1540]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Jan 17 13:42:43.062837 systemd-networkd[1410]: eth0: Gained IPv6LL Jan 17 13:42:43.068540 containerd[1493]: time="2025-01-17T13:42:43.066596398Z" level=info msg="starting containerd" revision=174e0d1785eeda18dc2beba45e1d5a188771636b version=v1.7.21 Jan 17 13:42:43.070032 systemd-hostnamed[1503]: Hostname set to (static) Jan 17 13:42:43.074270 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 17 13:42:43.077225 systemd[1]: Reached target network-online.target - Network is Online. Jan 17 13:42:43.088536 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 17 13:42:43.096367 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 17 13:42:43.179486 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 17 13:42:43.185670 containerd[1493]: time="2025-01-17T13:42:43.185623165Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Jan 17 13:42:43.196266 containerd[1493]: time="2025-01-17T13:42:43.196222960Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Jan 17 13:42:43.199542 containerd[1493]: time="2025-01-17T13:42:43.196753409Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Jan 17 13:42:43.199542 containerd[1493]: time="2025-01-17T13:42:43.196798716Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Jan 17 13:42:43.199542 containerd[1493]: time="2025-01-17T13:42:43.197079516Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Jan 17 13:42:43.199542 containerd[1493]: time="2025-01-17T13:42:43.197122649Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Jan 17 13:42:43.199542 containerd[1493]: time="2025-01-17T13:42:43.197252307Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Jan 17 13:42:43.199542 containerd[1493]: time="2025-01-17T13:42:43.197275780Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Jan 17 13:42:43.199542 containerd[1493]: time="2025-01-17T13:42:43.197520911Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." 
error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 17 13:42:43.199542 containerd[1493]: time="2025-01-17T13:42:43.197554241Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Jan 17 13:42:43.199542 containerd[1493]: time="2025-01-17T13:42:43.197589865Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Jan 17 13:42:43.199542 containerd[1493]: time="2025-01-17T13:42:43.197605093Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Jan 17 13:42:43.199542 containerd[1493]: time="2025-01-17T13:42:43.197783607Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Jan 17 13:42:43.201384 containerd[1493]: time="2025-01-17T13:42:43.199576491Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Jan 17 13:42:43.201384 containerd[1493]: time="2025-01-17T13:42:43.199738082Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Jan 17 13:42:43.201384 containerd[1493]: time="2025-01-17T13:42:43.199773811Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Jan 17 13:42:43.201384 containerd[1493]: time="2025-01-17T13:42:43.199911639Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Jan 17 13:42:43.201384 containerd[1493]: time="2025-01-17T13:42:43.200011815Z" level=info msg="metadata content store policy set" policy=shared Jan 17 13:42:43.207658 containerd[1493]: time="2025-01-17T13:42:43.207324077Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Jan 17 13:42:43.207658 containerd[1493]: time="2025-01-17T13:42:43.207409928Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Jan 17 13:42:43.207658 containerd[1493]: time="2025-01-17T13:42:43.207438013Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Jan 17 13:42:43.207658 containerd[1493]: time="2025-01-17T13:42:43.207463206Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Jan 17 13:42:43.207658 containerd[1493]: time="2025-01-17T13:42:43.207494006Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Jan 17 13:42:43.207898 containerd[1493]: time="2025-01-17T13:42:43.207678957Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208024664Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208240725Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." 
type=io.containerd.runtime.v2 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208267913Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208288436Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208308523Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208329080Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208349241Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208378388Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208399454Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208419067Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208439014Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208457064Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208496875Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210336 containerd[1493]: time="2025-01-17T13:42:43.208521819Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208542142Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208576428Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208596803Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208623379Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208644264Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208674699Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208693704Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." 
type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208715720Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208744924Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208764861Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208783382Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208817200Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208902346Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208947504Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.210827 containerd[1493]: time="2025-01-17T13:42:43.208977193Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Jan 17 13:42:43.211385 containerd[1493]: time="2025-01-17T13:42:43.209116600Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Jan 17 13:42:43.211385 containerd[1493]: time="2025-01-17T13:42:43.209155899Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Jan 17 13:42:43.214801 containerd[1493]: time="2025-01-17T13:42:43.209175106Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Jan 17 13:42:43.214801 containerd[1493]: time="2025-01-17T13:42:43.212311388Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Jan 17 13:42:43.214801 containerd[1493]: time="2025-01-17T13:42:43.212332799Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Jan 17 13:42:43.214801 containerd[1493]: time="2025-01-17T13:42:43.212383760Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Jan 17 13:42:43.214801 containerd[1493]: time="2025-01-17T13:42:43.212410760Z" level=info msg="NRI interface is disabled by configuration." Jan 17 13:42:43.214801 containerd[1493]: time="2025-01-17T13:42:43.212460443Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Jan 17 13:42:43.215080 containerd[1493]: time="2025-01-17T13:42:43.213827436Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Jan 17 13:42:43.215080 containerd[1493]: time="2025-01-17T13:42:43.213960659Z" level=info msg="Connect containerd service" Jan 17 13:42:43.215080 containerd[1493]: time="2025-01-17T13:42:43.214417381Z" level=info msg="using legacy CRI server" Jan 17 13:42:43.215080 containerd[1493]: time="2025-01-17T13:42:43.214442527Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 17 13:42:43.215933 containerd[1493]: time="2025-01-17T13:42:43.214768896Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Jan 17 13:42:43.220863 containerd[1493]: time="2025-01-17T13:42:43.218228045Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 17 13:42:43.220863 
containerd[1493]: time="2025-01-17T13:42:43.219471611Z" level=info msg="Start subscribing containerd event" Jan 17 13:42:43.220863 containerd[1493]: time="2025-01-17T13:42:43.219589289Z" level=info msg="Start recovering state" Jan 17 13:42:43.220863 containerd[1493]: time="2025-01-17T13:42:43.219714600Z" level=info msg="Start event monitor" Jan 17 13:42:43.220863 containerd[1493]: time="2025-01-17T13:42:43.219763574Z" level=info msg="Start snapshots syncer" Jan 17 13:42:43.220863 containerd[1493]: time="2025-01-17T13:42:43.219788870Z" level=info msg="Start cni network conf syncer for default" Jan 17 13:42:43.220863 containerd[1493]: time="2025-01-17T13:42:43.219810297Z" level=info msg="Start streaming server" Jan 17 13:42:43.221161 containerd[1493]: time="2025-01-17T13:42:43.220955240Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 17 13:42:43.222064 containerd[1493]: time="2025-01-17T13:42:43.221338329Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 17 13:42:43.226670 containerd[1493]: time="2025-01-17T13:42:43.223491947Z" level=info msg="containerd successfully booted in 0.160907s" Jan 17 13:42:43.224346 systemd[1]: Started containerd.service - containerd container runtime. Jan 17 13:42:43.631261 systemd-networkd[1410]: eth0: Ignoring DHCPv6 address 2a02:1348:179:827f:24:19ff:fee6:9fe/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:827f:24:19ff:fee6:9fe/64 assigned by NDisc. Jan 17 13:42:43.631268 systemd-networkd[1410]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. Jan 17 13:42:43.649925 sshd_keygen[1499]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 17 13:42:43.687754 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 17 13:42:43.698646 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 17 13:42:43.702958 systemd[1]: Started sshd@0-10.230.9.254:22-139.178.68.195:42634.service - OpenSSH per-connection server daemon (139.178.68.195:42634). Jan 17 13:42:43.729730 systemd[1]: issuegen.service: Deactivated successfully. Jan 17 13:42:43.730922 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 17 13:42:43.744681 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 17 13:42:43.769228 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 17 13:42:43.779879 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 17 13:42:43.788293 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jan 17 13:42:43.791610 systemd[1]: Reached target getty.target - Login Prompts. Jan 17 13:42:43.813314 tar[1492]: linux-amd64/LICENSE Jan 17 13:42:43.813857 tar[1492]: linux-amd64/README.md Jan 17 13:42:43.828768 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 17 13:42:44.184378 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 17 13:42:44.191054 (kubelet)[1598]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 17 13:42:44.604887 sshd[1580]: Accepted publickey for core from 139.178.68.195 port 42634 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:42:44.608571 sshd[1580]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:42:44.626287 systemd-logind[1480]: New session 1 of user core. 
Jan 17 13:42:44.628952 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 17 13:42:44.644678 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 17 13:42:44.672749 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 17 13:42:44.683843 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 17 13:42:44.701956 (systemd)[1606]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 17 13:42:44.842864 systemd[1606]: Queued start job for default target default.target. Jan 17 13:42:44.851158 systemd[1606]: Created slice app.slice - User Application Slice. Jan 17 13:42:44.851428 systemd[1606]: Reached target paths.target - Paths. Jan 17 13:42:44.851452 systemd[1606]: Reached target timers.target - Timers. Jan 17 13:42:44.855353 systemd[1606]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 17 13:42:44.868703 kubelet[1598]: E0117 13:42:44.868639 1598 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 17 13:42:44.872044 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 17 13:42:44.872331 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 17 13:42:44.872939 systemd[1]: kubelet.service: Consumed 1.021s CPU time. Jan 17 13:42:44.882522 systemd[1606]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 17 13:42:44.882718 systemd[1606]: Reached target sockets.target - Sockets. Jan 17 13:42:44.882752 systemd[1606]: Reached target basic.target - Basic System. Jan 17 13:42:44.882820 systemd[1606]: Reached target default.target - Main User Target. Jan 17 13:42:44.882893 systemd[1606]: Startup finished in 171ms. Jan 17 13:42:44.883327 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 17 13:42:44.900557 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 17 13:42:45.533799 systemd[1]: Started sshd@1-10.230.9.254:22-139.178.68.195:57084.service - OpenSSH per-connection server daemon (139.178.68.195:57084). Jan 17 13:42:46.420767 sshd[1620]: Accepted publickey for core from 139.178.68.195 port 57084 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:42:46.423814 sshd[1620]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:42:46.431160 systemd-logind[1480]: New session 2 of user core. Jan 17 13:42:46.443530 systemd[1]: Started session-2.scope - Session 2 of User core. Jan 17 13:42:47.041697 sshd[1620]: pam_unix(sshd:session): session closed for user core Jan 17 13:42:47.046736 systemd[1]: sshd@1-10.230.9.254:22-139.178.68.195:57084.service: Deactivated successfully. Jan 17 13:42:47.049409 systemd[1]: session-2.scope: Deactivated successfully. Jan 17 13:42:47.050718 systemd-logind[1480]: Session 2 logged out. Waiting for processes to exit. Jan 17 13:42:47.052127 systemd-logind[1480]: Removed session 2. Jan 17 13:42:47.208909 systemd[1]: Started sshd@2-10.230.9.254:22-139.178.68.195:57100.service - OpenSSH per-connection server daemon (139.178.68.195:57100). 
Jan 17 13:42:48.091758 sshd[1628]: Accepted publickey for core from 139.178.68.195 port 57100 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:42:48.093862 sshd[1628]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:42:48.101456 systemd-logind[1480]: New session 3 of user core. Jan 17 13:42:48.108701 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 17 13:42:48.712057 sshd[1628]: pam_unix(sshd:session): session closed for user core Jan 17 13:42:48.716872 systemd[1]: sshd@2-10.230.9.254:22-139.178.68.195:57100.service: Deactivated successfully. Jan 17 13:42:48.719098 systemd[1]: session-3.scope: Deactivated successfully. Jan 17 13:42:48.720132 systemd-logind[1480]: Session 3 logged out. Waiting for processes to exit. Jan 17 13:42:48.721726 systemd-logind[1480]: Removed session 3. Jan 17 13:42:48.851963 login[1587]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 17 13:42:48.853915 login[1588]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 17 13:42:48.861312 systemd-logind[1480]: New session 4 of user core. Jan 17 13:42:48.873506 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 17 13:42:48.876692 systemd-logind[1480]: New session 5 of user core. Jan 17 13:42:48.878920 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 17 13:42:49.530695 coreos-metadata[1469]: Jan 17 13:42:49.530 WARN failed to locate config-drive, using the metadata service API instead Jan 17 13:42:49.557357 coreos-metadata[1469]: Jan 17 13:42:49.557 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Jan 17 13:42:49.564781 coreos-metadata[1469]: Jan 17 13:42:49.564 INFO Fetch failed with 404: resource not found Jan 17 13:42:49.564781 coreos-metadata[1469]: Jan 17 13:42:49.564 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Jan 17 13:42:49.565653 coreos-metadata[1469]: Jan 17 13:42:49.565 INFO Fetch successful Jan 17 13:42:49.565823 coreos-metadata[1469]: Jan 17 13:42:49.565 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Jan 17 13:42:49.579036 coreos-metadata[1469]: Jan 17 13:42:49.579 INFO Fetch successful Jan 17 13:42:49.579307 coreos-metadata[1469]: Jan 17 13:42:49.579 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Jan 17 13:42:49.595728 coreos-metadata[1469]: Jan 17 13:42:49.595 INFO Fetch successful Jan 17 13:42:49.595976 coreos-metadata[1469]: Jan 17 13:42:49.595 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Jan 17 13:42:49.615141 coreos-metadata[1469]: Jan 17 13:42:49.615 INFO Fetch successful Jan 17 13:42:49.615378 coreos-metadata[1469]: Jan 17 13:42:49.615 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Jan 17 13:42:49.631895 coreos-metadata[1469]: Jan 17 13:42:49.631 INFO Fetch successful Jan 17 13:42:49.662267 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 17 13:42:49.663345 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Jan 17 13:42:50.137959 coreos-metadata[1544]: Jan 17 13:42:50.137 WARN failed to locate config-drive, using the metadata service API instead Jan 17 13:42:50.160178 coreos-metadata[1544]: Jan 17 13:42:50.160 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Jan 17 13:42:50.185237 coreos-metadata[1544]: Jan 17 13:42:50.185 INFO Fetch successful Jan 17 13:42:50.185237 coreos-metadata[1544]: Jan 17 13:42:50.185 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Jan 17 13:42:50.216397 coreos-metadata[1544]: Jan 17 13:42:50.216 INFO Fetch successful Jan 17 13:42:50.218680 unknown[1544]: wrote ssh authorized keys file for user: core Jan 17 13:42:50.246448 update-ssh-keys[1670]: Updated "/home/core/.ssh/authorized_keys" Jan 17 13:42:50.247489 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 17 13:42:50.250685 systemd[1]: Finished sshkeys.service. Jan 17 13:42:50.253689 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 17 13:42:50.254342 systemd[1]: Startup finished in 1.375s (kernel) + 14.860s (initrd) + 11.918s (userspace) = 28.154s. Jan 17 13:42:55.122113 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 17 13:42:55.133990 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 17 13:42:55.344320 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 17 13:42:55.354626 (kubelet)[1682]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 17 13:42:55.425764 kubelet[1682]: E0117 13:42:55.425545 1682 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 17 13:42:55.429986 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 17 13:42:55.430266 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 17 13:42:58.865101 systemd[1]: Started sshd@3-10.230.9.254:22-139.178.68.195:40208.service - OpenSSH per-connection server daemon (139.178.68.195:40208). Jan 17 13:42:59.758059 sshd[1692]: Accepted publickey for core from 139.178.68.195 port 40208 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:42:59.760317 sshd[1692]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:42:59.768449 systemd-logind[1480]: New session 6 of user core. Jan 17 13:42:59.774403 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 17 13:43:00.375462 sshd[1692]: pam_unix(sshd:session): session closed for user core Jan 17 13:43:00.380515 systemd[1]: sshd@3-10.230.9.254:22-139.178.68.195:40208.service: Deactivated successfully. Jan 17 13:43:00.382740 systemd[1]: session-6.scope: Deactivated successfully. Jan 17 13:43:00.383755 systemd-logind[1480]: Session 6 logged out. Waiting for processes to exit. Jan 17 13:43:00.385227 systemd-logind[1480]: Removed session 6. Jan 17 13:43:00.535517 systemd[1]: Started sshd@4-10.230.9.254:22-139.178.68.195:40224.service - OpenSSH per-connection server daemon (139.178.68.195:40224). 
Jan 17 13:43:01.423320 sshd[1699]: Accepted publickey for core from 139.178.68.195 port 40224 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:43:01.425362 sshd[1699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:43:01.433255 systemd-logind[1480]: New session 7 of user core. Jan 17 13:43:01.438470 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 17 13:43:02.039983 sshd[1699]: pam_unix(sshd:session): session closed for user core Jan 17 13:43:02.044102 systemd[1]: sshd@4-10.230.9.254:22-139.178.68.195:40224.service: Deactivated successfully. Jan 17 13:43:02.046449 systemd[1]: session-7.scope: Deactivated successfully. Jan 17 13:43:02.048706 systemd-logind[1480]: Session 7 logged out. Waiting for processes to exit. Jan 17 13:43:02.050294 systemd-logind[1480]: Removed session 7. Jan 17 13:43:02.200563 systemd[1]: Started sshd@5-10.230.9.254:22-139.178.68.195:40226.service - OpenSSH per-connection server daemon (139.178.68.195:40226). Jan 17 13:43:03.103838 sshd[1706]: Accepted publickey for core from 139.178.68.195 port 40226 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:43:03.105899 sshd[1706]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:43:03.112474 systemd-logind[1480]: New session 8 of user core. Jan 17 13:43:03.123481 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 17 13:43:03.723715 sshd[1706]: pam_unix(sshd:session): session closed for user core Jan 17 13:43:03.728581 systemd[1]: sshd@5-10.230.9.254:22-139.178.68.195:40226.service: Deactivated successfully. Jan 17 13:43:03.730639 systemd[1]: session-8.scope: Deactivated successfully. Jan 17 13:43:03.731453 systemd-logind[1480]: Session 8 logged out. Waiting for processes to exit. Jan 17 13:43:03.733125 systemd-logind[1480]: Removed session 8. Jan 17 13:43:03.886593 systemd[1]: Started sshd@6-10.230.9.254:22-139.178.68.195:40228.service - OpenSSH per-connection server daemon (139.178.68.195:40228). Jan 17 13:43:04.769652 sshd[1713]: Accepted publickey for core from 139.178.68.195 port 40228 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:43:04.771787 sshd[1713]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:43:04.778258 systemd-logind[1480]: New session 9 of user core. Jan 17 13:43:04.786420 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 17 13:43:05.261536 sudo[1716]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 17 13:43:05.262067 sudo[1716]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 17 13:43:05.277727 sudo[1716]: pam_unix(sudo:session): session closed for user root Jan 17 13:43:05.421484 sshd[1713]: pam_unix(sshd:session): session closed for user core Jan 17 13:43:05.427859 systemd[1]: sshd@6-10.230.9.254:22-139.178.68.195:40228.service: Deactivated successfully. Jan 17 13:43:05.430920 systemd[1]: session-9.scope: Deactivated successfully. Jan 17 13:43:05.433061 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 17 13:43:05.433251 systemd-logind[1480]: Session 9 logged out. Waiting for processes to exit. Jan 17 13:43:05.441476 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 17 13:43:05.443096 systemd-logind[1480]: Removed session 9. 
Jan 17 13:43:05.591649 systemd[1]: Started sshd@7-10.230.9.254:22-139.178.68.195:57934.service - OpenSSH per-connection server daemon (139.178.68.195:57934). Jan 17 13:43:05.618450 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 17 13:43:05.624701 (kubelet)[1731]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 17 13:43:05.692900 kubelet[1731]: E0117 13:43:05.692790 1731 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 17 13:43:05.696140 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 17 13:43:05.696468 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 17 13:43:06.476724 sshd[1724]: Accepted publickey for core from 139.178.68.195 port 57934 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:43:06.478764 sshd[1724]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:43:06.485349 systemd-logind[1480]: New session 10 of user core. Jan 17 13:43:06.496459 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 17 13:43:06.955032 sudo[1741]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 17 13:43:06.955579 sudo[1741]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 17 13:43:06.961019 sudo[1741]: pam_unix(sudo:session): session closed for user root Jan 17 13:43:06.968792 sudo[1740]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Jan 17 13:43:06.969292 sudo[1740]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 17 13:43:06.987510 systemd[1]: Stopping audit-rules.service - Load Security Auditing Rules... Jan 17 13:43:06.990355 auditctl[1744]: No rules Jan 17 13:43:06.990862 systemd[1]: audit-rules.service: Deactivated successfully. Jan 17 13:43:06.991169 systemd[1]: Stopped audit-rules.service - Load Security Auditing Rules. Jan 17 13:43:06.994361 systemd[1]: Starting audit-rules.service - Load Security Auditing Rules... Jan 17 13:43:07.048782 augenrules[1762]: No rules Jan 17 13:43:07.049739 systemd[1]: Finished audit-rules.service - Load Security Auditing Rules. Jan 17 13:43:07.052585 sudo[1740]: pam_unix(sudo:session): session closed for user root Jan 17 13:43:07.242360 sshd[1724]: pam_unix(sshd:session): session closed for user core Jan 17 13:43:07.246866 systemd[1]: sshd@7-10.230.9.254:22-139.178.68.195:57934.service: Deactivated successfully. Jan 17 13:43:07.248945 systemd[1]: session-10.scope: Deactivated successfully. Jan 17 13:43:07.250708 systemd-logind[1480]: Session 10 logged out. Waiting for processes to exit. Jan 17 13:43:07.252378 systemd-logind[1480]: Removed session 10. Jan 17 13:43:07.409788 systemd[1]: Started sshd@8-10.230.9.254:22-139.178.68.195:57936.service - OpenSSH per-connection server daemon (139.178.68.195:57936). 
Jan 17 13:43:08.293209 sshd[1770]: Accepted publickey for core from 139.178.68.195 port 57936 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:43:08.295243 sshd[1770]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:43:08.301590 systemd-logind[1480]: New session 11 of user core. Jan 17 13:43:08.314549 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 17 13:43:08.771193 sudo[1773]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 17 13:43:08.771665 sudo[1773]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 17 13:43:09.239664 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 17 13:43:09.239901 (dockerd)[1788]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 17 13:43:09.682245 dockerd[1788]: time="2025-01-17T13:43:09.681904868Z" level=info msg="Starting up" Jan 17 13:43:09.812246 systemd[1]: var-lib-docker-metacopy\x2dcheck630540953-merged.mount: Deactivated successfully. Jan 17 13:43:09.836825 dockerd[1788]: time="2025-01-17T13:43:09.836698528Z" level=info msg="Loading containers: start." Jan 17 13:43:09.995228 kernel: Initializing XFRM netlink socket Jan 17 13:43:10.114328 systemd-networkd[1410]: docker0: Link UP Jan 17 13:43:10.136089 dockerd[1788]: time="2025-01-17T13:43:10.136037463Z" level=info msg="Loading containers: done." Jan 17 13:43:10.161146 dockerd[1788]: time="2025-01-17T13:43:10.161069803Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 17 13:43:10.161380 dockerd[1788]: time="2025-01-17T13:43:10.161239615Z" level=info msg="Docker daemon" commit=061aa95809be396a6b5542618d8a34b02a21ff77 containerd-snapshotter=false storage-driver=overlay2 version=26.1.0 Jan 17 13:43:10.161460 dockerd[1788]: time="2025-01-17T13:43:10.161423536Z" level=info msg="Daemon has completed initialization" Jan 17 13:43:10.197110 dockerd[1788]: time="2025-01-17T13:43:10.196951819Z" level=info msg="API listen on /run/docker.sock" Jan 17 13:43:10.197484 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 17 13:43:10.798797 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2490521849-merged.mount: Deactivated successfully. Jan 17 13:43:11.531307 containerd[1493]: time="2025-01-17T13:43:11.531167192Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.9\"" Jan 17 13:43:12.347204 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount685848484.mount: Deactivated successfully. Jan 17 13:43:13.685995 systemd[1]: systemd-hostnamed.service: Deactivated successfully. 
Jan 17 13:43:14.346633 containerd[1493]: time="2025-01-17T13:43:14.346553140Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:14.348230 containerd[1493]: time="2025-01-17T13:43:14.348146082Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.9: active requests=0, bytes read=32677020" Jan 17 13:43:14.349035 containerd[1493]: time="2025-01-17T13:43:14.348958461Z" level=info msg="ImageCreate event name:\"sha256:4f53be91109c4dd4658bb0141e8af556b94293ec9fad72b2b62a617edb48e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:14.353647 containerd[1493]: time="2025-01-17T13:43:14.353253092Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:540de8f810ac963b8ed93f7393a8746d68e7e8a2c79ea58ff409ac5b9ca6a9fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:14.354631 containerd[1493]: time="2025-01-17T13:43:14.354588740Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.9\" with image id \"sha256:4f53be91109c4dd4658bb0141e8af556b94293ec9fad72b2b62a617edb48e5c4\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:540de8f810ac963b8ed93f7393a8746d68e7e8a2c79ea58ff409ac5b9ca6a9fc\", size \"32673812\" in 2.823270753s" Jan 17 13:43:14.354730 containerd[1493]: time="2025-01-17T13:43:14.354663558Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.9\" returns image reference \"sha256:4f53be91109c4dd4658bb0141e8af556b94293ec9fad72b2b62a617edb48e5c4\"" Jan 17 13:43:14.392525 containerd[1493]: time="2025-01-17T13:43:14.392472518Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.9\"" Jan 17 13:43:15.872820 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 17 13:43:15.883599 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 17 13:43:16.075469 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 17 13:43:16.077845 (kubelet)[2011]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 17 13:43:16.200340 kubelet[2011]: E0117 13:43:16.199376 2011 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 17 13:43:16.204309 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 17 13:43:16.204739 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jan 17 13:43:16.592259 containerd[1493]: time="2025-01-17T13:43:16.592035265Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:16.594471 containerd[1493]: time="2025-01-17T13:43:16.594372168Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.9: active requests=0, bytes read=29605753" Jan 17 13:43:16.595103 containerd[1493]: time="2025-01-17T13:43:16.594539412Z" level=info msg="ImageCreate event name:\"sha256:d4203c1bb2593a7429c3df3c040da333190e5d7e01f377d0255b7b813ca09568\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:16.599979 containerd[1493]: time="2025-01-17T13:43:16.599246664Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:6350693c04956b13db2519e01ca12a0bbe58466e9f12ef8617f1429da6081f43\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:16.600970 containerd[1493]: time="2025-01-17T13:43:16.600899807Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.9\" with image id \"sha256:d4203c1bb2593a7429c3df3c040da333190e5d7e01f377d0255b7b813ca09568\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:6350693c04956b13db2519e01ca12a0bbe58466e9f12ef8617f1429da6081f43\", size \"31052327\" in 2.208168746s" Jan 17 13:43:16.601064 containerd[1493]: time="2025-01-17T13:43:16.601011074Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.9\" returns image reference \"sha256:d4203c1bb2593a7429c3df3c040da333190e5d7e01f377d0255b7b813ca09568\"" Jan 17 13:43:16.632765 containerd[1493]: time="2025-01-17T13:43:16.632717606Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.9\"" Jan 17 13:43:18.310084 containerd[1493]: time="2025-01-17T13:43:18.308445269Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:18.310084 containerd[1493]: time="2025-01-17T13:43:18.310024152Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.9: active requests=0, bytes read=17783072" Jan 17 13:43:18.310799 containerd[1493]: time="2025-01-17T13:43:18.310762815Z" level=info msg="ImageCreate event name:\"sha256:41cce68b0c8c3c4862ff55ac17be57616cce36a04e719aee733e5c7c1a24b725\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:18.315046 containerd[1493]: time="2025-01-17T13:43:18.315010526Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:153efd6dc89e61a38ef273cf4c4cebd2bfee68082c2ee3d4fab5da94e4ae13d3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:18.316733 containerd[1493]: time="2025-01-17T13:43:18.316689948Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.9\" with image id \"sha256:41cce68b0c8c3c4862ff55ac17be57616cce36a04e719aee733e5c7c1a24b725\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:153efd6dc89e61a38ef273cf4c4cebd2bfee68082c2ee3d4fab5da94e4ae13d3\", size \"19229664\" in 1.683669685s" Jan 17 13:43:18.316842 containerd[1493]: time="2025-01-17T13:43:18.316735231Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.9\" returns image reference \"sha256:41cce68b0c8c3c4862ff55ac17be57616cce36a04e719aee733e5c7c1a24b725\"" Jan 17 13:43:18.348417 
containerd[1493]: time="2025-01-17T13:43:18.348333517Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.9\"" Jan 17 13:43:20.036641 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2965813412.mount: Deactivated successfully. Jan 17 13:43:20.656526 containerd[1493]: time="2025-01-17T13:43:20.656379795Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:20.659305 containerd[1493]: time="2025-01-17T13:43:20.659235892Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.9: active requests=0, bytes read=29058345" Jan 17 13:43:20.660432 containerd[1493]: time="2025-01-17T13:43:20.660364169Z" level=info msg="ImageCreate event name:\"sha256:4c369683c359609256b8907f424fc2355f1e7e3eeb7295b1fd8ffc5304f4cede\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:20.663219 containerd[1493]: time="2025-01-17T13:43:20.663128248Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:d78dc40d97ff862fd8ddb47f80a5ba3feec17bc73e58a60e963885e33faa0083\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:20.664364 containerd[1493]: time="2025-01-17T13:43:20.664320798Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.9\" with image id \"sha256:4c369683c359609256b8907f424fc2355f1e7e3eeb7295b1fd8ffc5304f4cede\", repo tag \"registry.k8s.io/kube-proxy:v1.30.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:d78dc40d97ff862fd8ddb47f80a5ba3feec17bc73e58a60e963885e33faa0083\", size \"29057356\" in 2.315919362s" Jan 17 13:43:20.664451 containerd[1493]: time="2025-01-17T13:43:20.664367341Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.9\" returns image reference \"sha256:4c369683c359609256b8907f424fc2355f1e7e3eeb7295b1fd8ffc5304f4cede\"" Jan 17 13:43:20.695893 containerd[1493]: time="2025-01-17T13:43:20.695723084Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Jan 17 13:43:21.312632 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3536905493.mount: Deactivated successfully. 
Jan 17 13:43:22.595221 containerd[1493]: time="2025-01-17T13:43:22.593406029Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:22.595221 containerd[1493]: time="2025-01-17T13:43:22.594385753Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185769" Jan 17 13:43:22.596559 containerd[1493]: time="2025-01-17T13:43:22.596523233Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:22.602335 containerd[1493]: time="2025-01-17T13:43:22.602278351Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:22.604084 containerd[1493]: time="2025-01-17T13:43:22.604038913Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.908243477s" Jan 17 13:43:22.604170 containerd[1493]: time="2025-01-17T13:43:22.604119655Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Jan 17 13:43:22.639469 containerd[1493]: time="2025-01-17T13:43:22.639410501Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Jan 17 13:43:23.204236 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1458096129.mount: Deactivated successfully. 
Jan 17 13:43:23.211572 containerd[1493]: time="2025-01-17T13:43:23.210391698Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:23.212850 containerd[1493]: time="2025-01-17T13:43:23.212797968Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322298" Jan 17 13:43:23.214055 containerd[1493]: time="2025-01-17T13:43:23.214011013Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:23.217224 containerd[1493]: time="2025-01-17T13:43:23.217143085Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:23.218682 containerd[1493]: time="2025-01-17T13:43:23.218635595Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 579.161952ms" Jan 17 13:43:23.218876 containerd[1493]: time="2025-01-17T13:43:23.218847263Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Jan 17 13:43:23.251418 containerd[1493]: time="2025-01-17T13:43:23.251364020Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Jan 17 13:43:23.858788 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4150453077.mount: Deactivated successfully. Jan 17 13:43:26.372272 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 17 13:43:26.382519 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 17 13:43:26.706527 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 17 13:43:26.709255 (kubelet)[2153]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 17 13:43:26.842032 kubelet[2153]: E0117 13:43:26.840502 2153 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 17 13:43:26.843562 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 17 13:43:26.843832 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jan 17 13:43:27.916659 containerd[1493]: time="2025-01-17T13:43:27.916534402Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:27.919664 containerd[1493]: time="2025-01-17T13:43:27.919596375Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238579" Jan 17 13:43:27.920364 containerd[1493]: time="2025-01-17T13:43:27.920312248Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:27.924691 containerd[1493]: time="2025-01-17T13:43:27.924632242Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:43:27.926741 containerd[1493]: time="2025-01-17T13:43:27.926501590Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 4.675075356s" Jan 17 13:43:27.926741 containerd[1493]: time="2025-01-17T13:43:27.926551511Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" Jan 17 13:43:28.106617 update_engine[1482]: I20250117 13:43:28.106449 1482 update_attempter.cc:509] Updating boot flags... Jan 17 13:43:28.161769 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (2178) Jan 17 13:43:28.239251 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 36 scanned by (udev-worker) (2179) Jan 17 13:43:32.421403 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 17 13:43:32.431523 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 17 13:43:32.466169 systemd[1]: Reloading requested from client PID 2240 ('systemctl') (unit session-11.scope)... Jan 17 13:43:32.466230 systemd[1]: Reloading... Jan 17 13:43:32.637234 zram_generator::config[2279]: No configuration found. Jan 17 13:43:32.795209 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 17 13:43:32.903642 systemd[1]: Reloading finished in 436 ms. Jan 17 13:43:32.974494 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 17 13:43:32.979829 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 17 13:43:32.982400 systemd[1]: kubelet.service: Deactivated successfully. Jan 17 13:43:32.982706 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 17 13:43:32.987490 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 17 13:43:33.134526 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 17 13:43:33.142682 (kubelet)[2349]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 17 13:43:33.239908 kubelet[2349]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 17 13:43:33.240711 kubelet[2349]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 17 13:43:33.240711 kubelet[2349]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 17 13:43:33.243252 kubelet[2349]: I0117 13:43:33.242796 2349 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 17 13:43:34.430016 kubelet[2349]: I0117 13:43:34.429657 2349 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Jan 17 13:43:34.430016 kubelet[2349]: I0117 13:43:34.430011 2349 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 17 13:43:34.430705 kubelet[2349]: I0117 13:43:34.430665 2349 server.go:927] "Client rotation is on, will bootstrap in background" Jan 17 13:43:34.453449 kubelet[2349]: I0117 13:43:34.453414 2349 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 17 13:43:34.455716 kubelet[2349]: E0117 13:43:34.455663 2349 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.230.9.254:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:34.475094 kubelet[2349]: I0117 13:43:34.475040 2349 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 17 13:43:34.477973 kubelet[2349]: I0117 13:43:34.477873 2349 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 17 13:43:34.478415 kubelet[2349]: I0117 13:43:34.477957 2349 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-so9hk.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Jan 17 13:43:34.478700 kubelet[2349]: I0117 13:43:34.478451 2349 topology_manager.go:138] "Creating topology manager with none policy" Jan 17 13:43:34.478700 kubelet[2349]: I0117 13:43:34.478469 2349 container_manager_linux.go:301] "Creating device plugin manager" Jan 17 13:43:34.478849 kubelet[2349]: I0117 13:43:34.478700 2349 state_mem.go:36] "Initialized new in-memory state store" Jan 17 13:43:34.479782 kubelet[2349]: I0117 13:43:34.479748 2349 kubelet.go:400] "Attempting to sync node with API server" Jan 17 13:43:34.479782 kubelet[2349]: I0117 13:43:34.479775 2349 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 17 13:43:34.479935 kubelet[2349]: I0117 13:43:34.479827 2349 kubelet.go:312] "Adding apiserver pod source" Jan 17 13:43:34.479935 kubelet[2349]: I0117 13:43:34.479870 2349 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 17 13:43:34.484217 kubelet[2349]: W0117 13:43:34.484051 2349 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.9.254:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:34.484217 kubelet[2349]: E0117 13:43:34.484171 2349 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.230.9.254:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:34.485201 kubelet[2349]: W0117 13:43:34.484782 2349 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://10.230.9.254:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-so9hk.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:34.485201 kubelet[2349]: E0117 13:43:34.484896 2349 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.230.9.254:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-so9hk.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:34.485201 kubelet[2349]: I0117 13:43:34.485009 2349 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Jan 17 13:43:34.486833 kubelet[2349]: I0117 13:43:34.486811 2349 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 17 13:43:34.487048 kubelet[2349]: W0117 13:43:34.487028 2349 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 17 13:43:34.489452 kubelet[2349]: I0117 13:43:34.489429 2349 server.go:1264] "Started kubelet" Jan 17 13:43:34.490594 kubelet[2349]: I0117 13:43:34.490531 2349 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 17 13:43:34.495147 kubelet[2349]: I0117 13:43:34.494463 2349 server.go:455] "Adding debug handlers to kubelet server" Jan 17 13:43:34.497056 kubelet[2349]: I0117 13:43:34.496312 2349 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 17 13:43:34.497056 kubelet[2349]: I0117 13:43:34.496681 2349 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 17 13:43:34.497056 kubelet[2349]: E0117 13:43:34.496891 2349 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.230.9.254:6443/api/v1/namespaces/default/events\": dial tcp 10.230.9.254:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-so9hk.gb1.brightbox.com.181b7eb90a5d558e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-so9hk.gb1.brightbox.com,UID:srv-so9hk.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-so9hk.gb1.brightbox.com,},FirstTimestamp:2025-01-17 13:43:34.489396622 +0000 UTC m=+1.341921447,LastTimestamp:2025-01-17 13:43:34.489396622 +0000 UTC m=+1.341921447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-so9hk.gb1.brightbox.com,}" Jan 17 13:43:34.498430 kubelet[2349]: I0117 13:43:34.497849 2349 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 17 13:43:34.506461 kubelet[2349]: E0117 13:43:34.506431 2349 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"srv-so9hk.gb1.brightbox.com\" not found" Jan 17 13:43:34.506792 kubelet[2349]: I0117 13:43:34.506769 2349 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 17 13:43:34.507147 kubelet[2349]: I0117 13:43:34.507124 2349 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Jan 17 13:43:34.507384 kubelet[2349]: I0117 13:43:34.507363 2349 reconciler.go:26] "Reconciler: start to sync state" Jan 17 13:43:34.507949 kubelet[2349]: W0117 13:43:34.507903 2349 reflector.go:547] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.9.254:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:34.508110 kubelet[2349]: E0117 13:43:34.508088 2349 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.230.9.254:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:34.508999 kubelet[2349]: E0117 13:43:34.508939 2349 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.9.254:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-so9hk.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.9.254:6443: connect: connection refused" interval="200ms" Jan 17 13:43:34.509430 kubelet[2349]: E0117 13:43:34.509404 2349 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 17 13:43:34.510686 kubelet[2349]: I0117 13:43:34.510436 2349 factory.go:221] Registration of the systemd container factory successfully Jan 17 13:43:34.510686 kubelet[2349]: I0117 13:43:34.510562 2349 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 17 13:43:34.518293 kubelet[2349]: I0117 13:43:34.518140 2349 factory.go:221] Registration of the containerd container factory successfully Jan 17 13:43:34.535232 kubelet[2349]: I0117 13:43:34.535052 2349 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 17 13:43:34.537047 kubelet[2349]: I0117 13:43:34.537022 2349 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 17 13:43:34.537692 kubelet[2349]: I0117 13:43:34.537223 2349 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 17 13:43:34.537692 kubelet[2349]: I0117 13:43:34.537283 2349 kubelet.go:2337] "Starting kubelet main sync loop" Jan 17 13:43:34.537692 kubelet[2349]: E0117 13:43:34.537374 2349 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 17 13:43:34.547822 kubelet[2349]: W0117 13:43:34.547507 2349 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.9.254:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:34.547822 kubelet[2349]: E0117 13:43:34.547573 2349 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.230.9.254:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:34.555046 kubelet[2349]: I0117 13:43:34.554963 2349 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 17 13:43:34.555046 kubelet[2349]: I0117 13:43:34.554985 2349 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 17 13:43:34.555046 kubelet[2349]: I0117 13:43:34.555011 2349 state_mem.go:36] "Initialized new in-memory state store" Jan 17 13:43:34.556842 kubelet[2349]: I0117 13:43:34.556814 2349 policy_none.go:49] "None policy: Start" Jan 17 13:43:34.557565 kubelet[2349]: I0117 13:43:34.557541 2349 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 17 13:43:34.557664 kubelet[2349]: I0117 13:43:34.557577 2349 state_mem.go:35] "Initializing new in-memory state store" Jan 17 13:43:34.569521 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 17 13:43:34.580416 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 17 13:43:34.586541 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jan 17 13:43:34.598441 kubelet[2349]: I0117 13:43:34.598408 2349 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 17 13:43:34.599231 kubelet[2349]: I0117 13:43:34.598671 2349 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 17 13:43:34.599231 kubelet[2349]: I0117 13:43:34.598868 2349 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 17 13:43:34.602455 kubelet[2349]: E0117 13:43:34.602416 2349 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-so9hk.gb1.brightbox.com\" not found" Jan 17 13:43:34.609602 kubelet[2349]: I0117 13:43:34.609489 2349 kubelet_node_status.go:73] "Attempting to register node" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.610151 kubelet[2349]: E0117 13:43:34.610097 2349 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.230.9.254:6443/api/v1/nodes\": dial tcp 10.230.9.254:6443: connect: connection refused" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.637560 kubelet[2349]: I0117 13:43:34.637491 2349 topology_manager.go:215] "Topology Admit Handler" podUID="4035e663879a9dba1dba86d82573ddde" podNamespace="kube-system" podName="kube-apiserver-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.640018 kubelet[2349]: I0117 13:43:34.639756 2349 topology_manager.go:215] "Topology Admit Handler" podUID="266bc621e6a5d318973363647409176b" podNamespace="kube-system" podName="kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.641843 kubelet[2349]: I0117 13:43:34.641814 2349 topology_manager.go:215] "Topology Admit Handler" podUID="ed71dbacd1520a800c6fdbef84791128" podNamespace="kube-system" podName="kube-scheduler-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.651873 systemd[1]: Created slice kubepods-burstable-pod4035e663879a9dba1dba86d82573ddde.slice - libcontainer container kubepods-burstable-pod4035e663879a9dba1dba86d82573ddde.slice. Jan 17 13:43:34.664236 systemd[1]: Created slice kubepods-burstable-pod266bc621e6a5d318973363647409176b.slice - libcontainer container kubepods-burstable-pod266bc621e6a5d318973363647409176b.slice. Jan 17 13:43:34.678077 systemd[1]: Created slice kubepods-burstable-poded71dbacd1520a800c6fdbef84791128.slice - libcontainer container kubepods-burstable-poded71dbacd1520a800c6fdbef84791128.slice. 
Jan 17 13:43:34.709786 kubelet[2349]: E0117 13:43:34.709652 2349 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.9.254:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-so9hk.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.9.254:6443: connect: connection refused" interval="400ms" Jan 17 13:43:34.808432 kubelet[2349]: I0117 13:43:34.808301 2349 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4035e663879a9dba1dba86d82573ddde-usr-share-ca-certificates\") pod \"kube-apiserver-srv-so9hk.gb1.brightbox.com\" (UID: \"4035e663879a9dba1dba86d82573ddde\") " pod="kube-system/kube-apiserver-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.808432 kubelet[2349]: I0117 13:43:34.808371 2349 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/266bc621e6a5d318973363647409176b-ca-certs\") pod \"kube-controller-manager-srv-so9hk.gb1.brightbox.com\" (UID: \"266bc621e6a5d318973363647409176b\") " pod="kube-system/kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.808432 kubelet[2349]: I0117 13:43:34.808402 2349 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/266bc621e6a5d318973363647409176b-flexvolume-dir\") pod \"kube-controller-manager-srv-so9hk.gb1.brightbox.com\" (UID: \"266bc621e6a5d318973363647409176b\") " pod="kube-system/kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.808432 kubelet[2349]: I0117 13:43:34.808429 2349 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/266bc621e6a5d318973363647409176b-k8s-certs\") pod \"kube-controller-manager-srv-so9hk.gb1.brightbox.com\" (UID: \"266bc621e6a5d318973363647409176b\") " pod="kube-system/kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.808736 kubelet[2349]: I0117 13:43:34.808454 2349 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/266bc621e6a5d318973363647409176b-kubeconfig\") pod \"kube-controller-manager-srv-so9hk.gb1.brightbox.com\" (UID: \"266bc621e6a5d318973363647409176b\") " pod="kube-system/kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.808736 kubelet[2349]: I0117 13:43:34.808480 2349 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/266bc621e6a5d318973363647409176b-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-so9hk.gb1.brightbox.com\" (UID: \"266bc621e6a5d318973363647409176b\") " pod="kube-system/kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.808736 kubelet[2349]: I0117 13:43:34.808524 2349 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ed71dbacd1520a800c6fdbef84791128-kubeconfig\") pod \"kube-scheduler-srv-so9hk.gb1.brightbox.com\" (UID: \"ed71dbacd1520a800c6fdbef84791128\") " pod="kube-system/kube-scheduler-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.808736 kubelet[2349]: I0117 13:43:34.808555 2349 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4035e663879a9dba1dba86d82573ddde-ca-certs\") pod \"kube-apiserver-srv-so9hk.gb1.brightbox.com\" (UID: \"4035e663879a9dba1dba86d82573ddde\") " pod="kube-system/kube-apiserver-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.808736 kubelet[2349]: I0117 13:43:34.808597 2349 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4035e663879a9dba1dba86d82573ddde-k8s-certs\") pod \"kube-apiserver-srv-so9hk.gb1.brightbox.com\" (UID: \"4035e663879a9dba1dba86d82573ddde\") " pod="kube-system/kube-apiserver-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.812758 kubelet[2349]: I0117 13:43:34.812722 2349 kubelet_node_status.go:73] "Attempting to register node" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.813174 kubelet[2349]: E0117 13:43:34.813127 2349 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.230.9.254:6443/api/v1/nodes\": dial tcp 10.230.9.254:6443: connect: connection refused" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:34.964620 containerd[1493]: time="2025-01-17T13:43:34.964451310Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-so9hk.gb1.brightbox.com,Uid:4035e663879a9dba1dba86d82573ddde,Namespace:kube-system,Attempt:0,}" Jan 17 13:43:34.980901 containerd[1493]: time="2025-01-17T13:43:34.980695326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-so9hk.gb1.brightbox.com,Uid:266bc621e6a5d318973363647409176b,Namespace:kube-system,Attempt:0,}" Jan 17 13:43:34.983089 containerd[1493]: time="2025-01-17T13:43:34.982814591Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-so9hk.gb1.brightbox.com,Uid:ed71dbacd1520a800c6fdbef84791128,Namespace:kube-system,Attempt:0,}" Jan 17 13:43:35.111004 kubelet[2349]: E0117 13:43:35.110925 2349 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.9.254:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-so9hk.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.9.254:6443: connect: connection refused" interval="800ms" Jan 17 13:43:35.217245 kubelet[2349]: I0117 13:43:35.216631 2349 kubelet_node_status.go:73] "Attempting to register node" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:35.217245 kubelet[2349]: E0117 13:43:35.217060 2349 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.230.9.254:6443/api/v1/nodes\": dial tcp 10.230.9.254:6443: connect: connection refused" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:35.484254 kubelet[2349]: W0117 13:43:35.483956 2349 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.9.254:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:35.484254 kubelet[2349]: E0117 13:43:35.484046 2349 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.230.9.254:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:35.550600 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1277852707.mount: Deactivated successfully. 
Jan 17 13:43:35.559542 containerd[1493]: time="2025-01-17T13:43:35.558030693Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 17 13:43:35.559677 containerd[1493]: time="2025-01-17T13:43:35.559592669Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 17 13:43:35.561748 containerd[1493]: time="2025-01-17T13:43:35.561711528Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 17 13:43:35.563060 containerd[1493]: time="2025-01-17T13:43:35.563009276Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 17 13:43:35.564267 containerd[1493]: time="2025-01-17T13:43:35.564218363Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 17 13:43:35.566616 containerd[1493]: time="2025-01-17T13:43:35.566568839Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 17 13:43:35.566947 containerd[1493]: time="2025-01-17T13:43:35.566904263Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312064" Jan 17 13:43:35.570058 containerd[1493]: time="2025-01-17T13:43:35.570020224Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 17 13:43:35.573570 containerd[1493]: time="2025-01-17T13:43:35.573524976Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 608.904256ms" Jan 17 13:43:35.577223 containerd[1493]: time="2025-01-17T13:43:35.577139365Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 594.251778ms" Jan 17 13:43:35.577535 containerd[1493]: time="2025-01-17T13:43:35.577494407Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 596.697433ms" Jan 17 13:43:35.643171 kubelet[2349]: W0117 13:43:35.642436 2349 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.9.254:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-so9hk.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused 
Jan 17 13:43:35.643171 kubelet[2349]: E0117 13:43:35.642839 2349 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.230.9.254:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-so9hk.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:35.680320 kubelet[2349]: W0117 13:43:35.679958 2349 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.9.254:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:35.680320 kubelet[2349]: E0117 13:43:35.680230 2349 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.230.9.254:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:35.802306 containerd[1493]: time="2025-01-17T13:43:35.801057307Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:43:35.802487 containerd[1493]: time="2025-01-17T13:43:35.801251294Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:43:35.802487 containerd[1493]: time="2025-01-17T13:43:35.801288616Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:43:35.802487 containerd[1493]: time="2025-01-17T13:43:35.801493715Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:43:35.811289 containerd[1493]: time="2025-01-17T13:43:35.809801752Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:43:35.811289 containerd[1493]: time="2025-01-17T13:43:35.809863981Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:43:35.811289 containerd[1493]: time="2025-01-17T13:43:35.809890039Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:43:35.811289 containerd[1493]: time="2025-01-17T13:43:35.809998911Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:43:35.815345 containerd[1493]: time="2025-01-17T13:43:35.812246838Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:43:35.815492 containerd[1493]: time="2025-01-17T13:43:35.815416385Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:43:35.815492 containerd[1493]: time="2025-01-17T13:43:35.815441884Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:43:35.815693 containerd[1493]: time="2025-01-17T13:43:35.815571991Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:43:35.847415 systemd[1]: Started cri-containerd-f7ab528a04fb5b250fcc57ba4413267ec28cb9727654d7d162bdf1e016999214.scope - libcontainer container f7ab528a04fb5b250fcc57ba4413267ec28cb9727654d7d162bdf1e016999214. Jan 17 13:43:35.871732 systemd[1]: Started cri-containerd-819b49dd97808b5bbd9a4b6061b8e0cdc33497c44a6fee48fa63893ce7441ba5.scope - libcontainer container 819b49dd97808b5bbd9a4b6061b8e0cdc33497c44a6fee48fa63893ce7441ba5. Jan 17 13:43:35.875335 systemd[1]: Started cri-containerd-cf8da3ef4920a90d9552949f1b81e1fca5c47fdf72f5c9011b2ecd24535e8327.scope - libcontainer container cf8da3ef4920a90d9552949f1b81e1fca5c47fdf72f5c9011b2ecd24535e8327. Jan 17 13:43:35.912711 kubelet[2349]: E0117 13:43:35.912633 2349 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.9.254:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-so9hk.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.9.254:6443: connect: connection refused" interval="1.6s" Jan 17 13:43:35.965277 containerd[1493]: time="2025-01-17T13:43:35.965044276Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-so9hk.gb1.brightbox.com,Uid:ed71dbacd1520a800c6fdbef84791128,Namespace:kube-system,Attempt:0,} returns sandbox id \"f7ab528a04fb5b250fcc57ba4413267ec28cb9727654d7d162bdf1e016999214\"" Jan 17 13:43:35.988658 containerd[1493]: time="2025-01-17T13:43:35.987950821Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-so9hk.gb1.brightbox.com,Uid:4035e663879a9dba1dba86d82573ddde,Namespace:kube-system,Attempt:0,} returns sandbox id \"819b49dd97808b5bbd9a4b6061b8e0cdc33497c44a6fee48fa63893ce7441ba5\"" Jan 17 13:43:35.988658 containerd[1493]: time="2025-01-17T13:43:35.988379600Z" level=info msg="CreateContainer within sandbox \"f7ab528a04fb5b250fcc57ba4413267ec28cb9727654d7d162bdf1e016999214\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 17 13:43:35.995043 containerd[1493]: time="2025-01-17T13:43:35.994868912Z" level=info msg="CreateContainer within sandbox \"819b49dd97808b5bbd9a4b6061b8e0cdc33497c44a6fee48fa63893ce7441ba5\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 17 13:43:35.996453 containerd[1493]: time="2025-01-17T13:43:35.996255741Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-so9hk.gb1.brightbox.com,Uid:266bc621e6a5d318973363647409176b,Namespace:kube-system,Attempt:0,} returns sandbox id \"cf8da3ef4920a90d9552949f1b81e1fca5c47fdf72f5c9011b2ecd24535e8327\"" Jan 17 13:43:36.003553 containerd[1493]: time="2025-01-17T13:43:36.003311282Z" level=info msg="CreateContainer within sandbox \"cf8da3ef4920a90d9552949f1b81e1fca5c47fdf72f5c9011b2ecd24535e8327\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 17 13:43:36.009432 kubelet[2349]: W0117 13:43:36.009109 2349 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.9.254:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:36.009745 kubelet[2349]: E0117 13:43:36.009658 2349 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.230.9.254:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:36.020666 
containerd[1493]: time="2025-01-17T13:43:36.020617219Z" level=info msg="CreateContainer within sandbox \"819b49dd97808b5bbd9a4b6061b8e0cdc33497c44a6fee48fa63893ce7441ba5\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"dc611d36053e0a0441d4da4dcae9c023b29de60f15e0d78cc904bfecd391ce2b\"" Jan 17 13:43:36.021387 containerd[1493]: time="2025-01-17T13:43:36.021075531Z" level=info msg="CreateContainer within sandbox \"f7ab528a04fb5b250fcc57ba4413267ec28cb9727654d7d162bdf1e016999214\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c65dc13bf9d7b6e943190061e82c07e4abc00e6c67c74fb7c7b453a5413df3a6\"" Jan 17 13:43:36.022209 containerd[1493]: time="2025-01-17T13:43:36.022086552Z" level=info msg="StartContainer for \"dc611d36053e0a0441d4da4dcae9c023b29de60f15e0d78cc904bfecd391ce2b\"" Jan 17 13:43:36.025141 containerd[1493]: time="2025-01-17T13:43:36.024361803Z" level=info msg="StartContainer for \"c65dc13bf9d7b6e943190061e82c07e4abc00e6c67c74fb7c7b453a5413df3a6\"" Jan 17 13:43:36.028359 kubelet[2349]: I0117 13:43:36.028001 2349 kubelet_node_status.go:73] "Attempting to register node" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:36.029951 kubelet[2349]: E0117 13:43:36.028547 2349 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.230.9.254:6443/api/v1/nodes\": dial tcp 10.230.9.254:6443: connect: connection refused" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:36.030775 containerd[1493]: time="2025-01-17T13:43:36.030736767Z" level=info msg="CreateContainer within sandbox \"cf8da3ef4920a90d9552949f1b81e1fca5c47fdf72f5c9011b2ecd24535e8327\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"82cac9d167ef0c06847668179769d1acbd36aaeb0c863c42b870e3409ca6cbef\"" Jan 17 13:43:36.031313 containerd[1493]: time="2025-01-17T13:43:36.031281079Z" level=info msg="StartContainer for \"82cac9d167ef0c06847668179769d1acbd36aaeb0c863c42b870e3409ca6cbef\"" Jan 17 13:43:36.075389 systemd[1]: Started cri-containerd-c65dc13bf9d7b6e943190061e82c07e4abc00e6c67c74fb7c7b453a5413df3a6.scope - libcontainer container c65dc13bf9d7b6e943190061e82c07e4abc00e6c67c74fb7c7b453a5413df3a6. Jan 17 13:43:36.092149 systemd[1]: Started cri-containerd-82cac9d167ef0c06847668179769d1acbd36aaeb0c863c42b870e3409ca6cbef.scope - libcontainer container 82cac9d167ef0c06847668179769d1acbd36aaeb0c863c42b870e3409ca6cbef. Jan 17 13:43:36.108387 systemd[1]: Started cri-containerd-dc611d36053e0a0441d4da4dcae9c023b29de60f15e0d78cc904bfecd391ce2b.scope - libcontainer container dc611d36053e0a0441d4da4dcae9c023b29de60f15e0d78cc904bfecd391ce2b. 
Jan 17 13:43:36.196959 containerd[1493]: time="2025-01-17T13:43:36.196912627Z" level=info msg="StartContainer for \"c65dc13bf9d7b6e943190061e82c07e4abc00e6c67c74fb7c7b453a5413df3a6\" returns successfully" Jan 17 13:43:36.227213 containerd[1493]: time="2025-01-17T13:43:36.226744730Z" level=info msg="StartContainer for \"82cac9d167ef0c06847668179769d1acbd36aaeb0c863c42b870e3409ca6cbef\" returns successfully" Jan 17 13:43:36.235960 containerd[1493]: time="2025-01-17T13:43:36.235918853Z" level=info msg="StartContainer for \"dc611d36053e0a0441d4da4dcae9c023b29de60f15e0d78cc904bfecd391ce2b\" returns successfully" Jan 17 13:43:36.632596 kubelet[2349]: E0117 13:43:36.632519 2349 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.230.9.254:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.230.9.254:6443: connect: connection refused Jan 17 13:43:37.633458 kubelet[2349]: I0117 13:43:37.633402 2349 kubelet_node_status.go:73] "Attempting to register node" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:38.936046 kubelet[2349]: E0117 13:43:38.935977 2349 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-so9hk.gb1.brightbox.com\" not found" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:39.032265 kubelet[2349]: I0117 13:43:39.032214 2349 kubelet_node_status.go:76] "Successfully registered node" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:39.084222 kubelet[2349]: E0117 13:43:39.083936 2349 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{srv-so9hk.gb1.brightbox.com.181b7eb90a5d558e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-so9hk.gb1.brightbox.com,UID:srv-so9hk.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-so9hk.gb1.brightbox.com,},FirstTimestamp:2025-01-17 13:43:34.489396622 +0000 UTC m=+1.341921447,LastTimestamp:2025-01-17 13:43:34.489396622 +0000 UTC m=+1.341921447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-so9hk.gb1.brightbox.com,}" Jan 17 13:43:39.484836 kubelet[2349]: I0117 13:43:39.484759 2349 apiserver.go:52] "Watching apiserver" Jan 17 13:43:39.507719 kubelet[2349]: I0117 13:43:39.507655 2349 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Jan 17 13:43:39.597462 kubelet[2349]: E0117 13:43:39.597407 2349 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-scheduler-srv-so9hk.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:41.174801 systemd[1]: Reloading requested from client PID 2626 ('systemctl') (unit session-11.scope)... Jan 17 13:43:41.174852 systemd[1]: Reloading... Jan 17 13:43:41.325241 zram_generator::config[2674]: No configuration found. Jan 17 13:43:41.496653 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 17 13:43:41.633769 systemd[1]: Reloading finished in 458 ms. 
Jan 17 13:43:41.702501 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 17 13:43:41.714364 systemd[1]: kubelet.service: Deactivated successfully. Jan 17 13:43:41.714848 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 17 13:43:41.715003 systemd[1]: kubelet.service: Consumed 1.861s CPU time, 112.6M memory peak, 0B memory swap peak. Jan 17 13:43:41.721583 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 17 13:43:41.974436 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 17 13:43:41.983739 (kubelet)[2728]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 17 13:43:42.120773 kubelet[2728]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 17 13:43:42.120773 kubelet[2728]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 17 13:43:42.120773 kubelet[2728]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 17 13:43:42.122312 kubelet[2728]: I0117 13:43:42.121834 2728 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 17 13:43:42.132217 kubelet[2728]: I0117 13:43:42.131622 2728 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Jan 17 13:43:42.132217 kubelet[2728]: I0117 13:43:42.131659 2728 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 17 13:43:42.132217 kubelet[2728]: I0117 13:43:42.131974 2728 server.go:927] "Client rotation is on, will bootstrap in background" Jan 17 13:43:42.135411 kubelet[2728]: I0117 13:43:42.135382 2728 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 17 13:43:42.139340 kubelet[2728]: I0117 13:43:42.138443 2728 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 17 13:43:42.154348 kubelet[2728]: I0117 13:43:42.153889 2728 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 17 13:43:42.154526 kubelet[2728]: I0117 13:43:42.154351 2728 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 17 13:43:42.154815 kubelet[2728]: I0117 13:43:42.154399 2728 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-so9hk.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Jan 17 13:43:42.157557 kubelet[2728]: I0117 13:43:42.156607 2728 topology_manager.go:138] "Creating topology manager with none policy" Jan 17 13:43:42.157557 kubelet[2728]: I0117 13:43:42.156638 2728 container_manager_linux.go:301] "Creating device plugin manager" Jan 17 13:43:42.157557 kubelet[2728]: I0117 13:43:42.156752 2728 state_mem.go:36] "Initialized new in-memory state store" Jan 17 13:43:42.158097 kubelet[2728]: I0117 13:43:42.157921 2728 kubelet.go:400] "Attempting to sync node with API server" Jan 17 13:43:42.158097 kubelet[2728]: I0117 13:43:42.157983 2728 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 17 13:43:42.163369 kubelet[2728]: I0117 13:43:42.163343 2728 kubelet.go:312] "Adding apiserver pod source" Jan 17 13:43:42.166129 kubelet[2728]: I0117 13:43:42.164837 2728 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 17 13:43:42.181341 kubelet[2728]: I0117 13:43:42.181306 2728 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.21" apiVersion="v1" Jan 17 13:43:42.186100 kubelet[2728]: I0117 13:43:42.184907 2728 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 17 13:43:42.196797 kubelet[2728]: I0117 13:43:42.196772 2728 server.go:1264] "Started kubelet" Jan 17 13:43:42.199123 kubelet[2728]: I0117 13:43:42.199009 2728 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 17 13:43:42.204200 kubelet[2728]: I0117 13:43:42.203404 2728 server.go:455] "Adding debug handlers to kubelet server" Jan 17 13:43:42.205915 kubelet[2728]: I0117 13:43:42.204748 2728 ratelimit.go:55] "Setting 
rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 17 13:43:42.206583 kubelet[2728]: I0117 13:43:42.206545 2728 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 17 13:43:42.211614 kubelet[2728]: I0117 13:43:42.211594 2728 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 17 13:43:42.225658 kubelet[2728]: I0117 13:43:42.225434 2728 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 17 13:43:42.226369 kubelet[2728]: I0117 13:43:42.226346 2728 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Jan 17 13:43:42.229941 kubelet[2728]: I0117 13:43:42.228985 2728 reconciler.go:26] "Reconciler: start to sync state" Jan 17 13:43:42.232216 kubelet[2728]: E0117 13:43:42.232131 2728 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 17 13:43:42.235171 kubelet[2728]: I0117 13:43:42.235130 2728 factory.go:221] Registration of the systemd container factory successfully Jan 17 13:43:42.238242 kubelet[2728]: I0117 13:43:42.236866 2728 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 17 13:43:42.242103 kubelet[2728]: I0117 13:43:42.242030 2728 factory.go:221] Registration of the containerd container factory successfully Jan 17 13:43:42.276555 kubelet[2728]: I0117 13:43:42.276475 2728 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 17 13:43:42.281407 kubelet[2728]: I0117 13:43:42.281375 2728 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 17 13:43:42.281508 kubelet[2728]: I0117 13:43:42.281440 2728 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 17 13:43:42.281508 kubelet[2728]: I0117 13:43:42.281478 2728 kubelet.go:2337] "Starting kubelet main sync loop" Jan 17 13:43:42.281612 kubelet[2728]: E0117 13:43:42.281559 2728 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 17 13:43:42.351614 kubelet[2728]: I0117 13:43:42.351284 2728 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 17 13:43:42.351614 kubelet[2728]: I0117 13:43:42.351314 2728 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 17 13:43:42.351614 kubelet[2728]: I0117 13:43:42.351346 2728 state_mem.go:36] "Initialized new in-memory state store" Jan 17 13:43:42.351614 kubelet[2728]: I0117 13:43:42.351597 2728 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 17 13:43:42.352810 kubelet[2728]: I0117 13:43:42.351617 2728 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 17 13:43:42.352810 kubelet[2728]: I0117 13:43:42.351746 2728 policy_none.go:49] "None policy: Start" Jan 17 13:43:42.353640 kubelet[2728]: I0117 13:43:42.353611 2728 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 17 13:43:42.353728 kubelet[2728]: I0117 13:43:42.353659 2728 state_mem.go:35] "Initializing new in-memory state store" Jan 17 13:43:42.354770 kubelet[2728]: I0117 13:43:42.353846 2728 state_mem.go:75] "Updated machine memory state" Jan 17 13:43:42.362386 kubelet[2728]: I0117 13:43:42.361899 2728 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 17 
13:43:42.362386 kubelet[2728]: I0117 13:43:42.362212 2728 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 17 13:43:42.378859 kubelet[2728]: I0117 13:43:42.378822 2728 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 17 13:43:42.382368 kubelet[2728]: I0117 13:43:42.382151 2728 topology_manager.go:215] "Topology Admit Handler" podUID="4035e663879a9dba1dba86d82573ddde" podNamespace="kube-system" podName="kube-apiserver-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.383856 kubelet[2728]: I0117 13:43:42.383545 2728 topology_manager.go:215] "Topology Admit Handler" podUID="266bc621e6a5d318973363647409176b" podNamespace="kube-system" podName="kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.386738 kubelet[2728]: I0117 13:43:42.386691 2728 topology_manager.go:215] "Topology Admit Handler" podUID="ed71dbacd1520a800c6fdbef84791128" podNamespace="kube-system" podName="kube-scheduler-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.408268 kubelet[2728]: W0117 13:43:42.406657 2728 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 17 13:43:42.411862 kubelet[2728]: W0117 13:43:42.410435 2728 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 17 13:43:42.412288 kubelet[2728]: W0117 13:43:42.410638 2728 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 17 13:43:42.431645 kubelet[2728]: I0117 13:43:42.431599 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/266bc621e6a5d318973363647409176b-flexvolume-dir\") pod \"kube-controller-manager-srv-so9hk.gb1.brightbox.com\" (UID: \"266bc621e6a5d318973363647409176b\") " pod="kube-system/kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.432247 kubelet[2728]: I0117 13:43:42.431840 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/266bc621e6a5d318973363647409176b-kubeconfig\") pod \"kube-controller-manager-srv-so9hk.gb1.brightbox.com\" (UID: \"266bc621e6a5d318973363647409176b\") " pod="kube-system/kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.432247 kubelet[2728]: I0117 13:43:42.431898 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/266bc621e6a5d318973363647409176b-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-so9hk.gb1.brightbox.com\" (UID: \"266bc621e6a5d318973363647409176b\") " pod="kube-system/kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.432247 kubelet[2728]: I0117 13:43:42.431982 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ed71dbacd1520a800c6fdbef84791128-kubeconfig\") pod \"kube-scheduler-srv-so9hk.gb1.brightbox.com\" (UID: \"ed71dbacd1520a800c6fdbef84791128\") " pod="kube-system/kube-scheduler-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.432247 kubelet[2728]: I0117 13:43:42.432016 2728 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/266bc621e6a5d318973363647409176b-ca-certs\") pod \"kube-controller-manager-srv-so9hk.gb1.brightbox.com\" (UID: \"266bc621e6a5d318973363647409176b\") " pod="kube-system/kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.432247 kubelet[2728]: I0117 13:43:42.432054 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4035e663879a9dba1dba86d82573ddde-k8s-certs\") pod \"kube-apiserver-srv-so9hk.gb1.brightbox.com\" (UID: \"4035e663879a9dba1dba86d82573ddde\") " pod="kube-system/kube-apiserver-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.432564 kubelet[2728]: I0117 13:43:42.432107 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4035e663879a9dba1dba86d82573ddde-usr-share-ca-certificates\") pod \"kube-apiserver-srv-so9hk.gb1.brightbox.com\" (UID: \"4035e663879a9dba1dba86d82573ddde\") " pod="kube-system/kube-apiserver-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.432564 kubelet[2728]: I0117 13:43:42.432132 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/266bc621e6a5d318973363647409176b-k8s-certs\") pod \"kube-controller-manager-srv-so9hk.gb1.brightbox.com\" (UID: \"266bc621e6a5d318973363647409176b\") " pod="kube-system/kube-controller-manager-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.432564 kubelet[2728]: I0117 13:43:42.432179 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4035e663879a9dba1dba86d82573ddde-ca-certs\") pod \"kube-apiserver-srv-so9hk.gb1.brightbox.com\" (UID: \"4035e663879a9dba1dba86d82573ddde\") " pod="kube-system/kube-apiserver-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.503072 kubelet[2728]: I0117 13:43:42.499642 2728 kubelet_node_status.go:73] "Attempting to register node" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.512220 kubelet[2728]: I0117 13:43:42.511916 2728 kubelet_node_status.go:112] "Node was previously registered" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:42.512220 kubelet[2728]: I0117 13:43:42.512038 2728 kubelet_node_status.go:76] "Successfully registered node" node="srv-so9hk.gb1.brightbox.com" Jan 17 13:43:43.166071 kubelet[2728]: I0117 13:43:43.165781 2728 apiserver.go:52] "Watching apiserver" Jan 17 13:43:43.227295 kubelet[2728]: I0117 13:43:43.227156 2728 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Jan 17 13:43:43.338227 kubelet[2728]: W0117 13:43:43.338068 2728 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Jan 17 13:43:43.341199 kubelet[2728]: E0117 13:43:43.338823 2728 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-srv-so9hk.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-so9hk.gb1.brightbox.com" Jan 17 13:43:43.429648 kubelet[2728]: I0117 13:43:43.428675 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-so9hk.gb1.brightbox.com" podStartSLOduration=1.428626214 podStartE2EDuration="1.428626214s" 
podCreationTimestamp="2025-01-17 13:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-17 13:43:43.381378791 +0000 UTC m=+1.368957366" watchObservedRunningTime="2025-01-17 13:43:43.428626214 +0000 UTC m=+1.416204800" Jan 17 13:43:43.471601 kubelet[2728]: I0117 13:43:43.471016 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-so9hk.gb1.brightbox.com" podStartSLOduration=1.47099214 podStartE2EDuration="1.47099214s" podCreationTimestamp="2025-01-17 13:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-17 13:43:43.43242289 +0000 UTC m=+1.420001458" watchObservedRunningTime="2025-01-17 13:43:43.47099214 +0000 UTC m=+1.458570702" Jan 17 13:43:43.519057 kubelet[2728]: I0117 13:43:43.518816 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-so9hk.gb1.brightbox.com" podStartSLOduration=1.518795275 podStartE2EDuration="1.518795275s" podCreationTimestamp="2025-01-17 13:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-17 13:43:43.47213195 +0000 UTC m=+1.459710538" watchObservedRunningTime="2025-01-17 13:43:43.518795275 +0000 UTC m=+1.506373855" Jan 17 13:43:48.191814 sudo[1773]: pam_unix(sudo:session): session closed for user root Jan 17 13:43:48.340446 sshd[1770]: pam_unix(sshd:session): session closed for user core Jan 17 13:43:48.347048 systemd[1]: sshd@8-10.230.9.254:22-139.178.68.195:57936.service: Deactivated successfully. Jan 17 13:43:48.349858 systemd[1]: session-11.scope: Deactivated successfully. Jan 17 13:43:48.350153 systemd[1]: session-11.scope: Consumed 6.944s CPU time, 188.5M memory peak, 0B memory swap peak. Jan 17 13:43:48.352548 systemd-logind[1480]: Session 11 logged out. Waiting for processes to exit. Jan 17 13:43:48.355264 systemd-logind[1480]: Removed session 11. Jan 17 13:43:56.397136 kubelet[2728]: I0117 13:43:56.396968 2728 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 17 13:43:56.401374 containerd[1493]: time="2025-01-17T13:43:56.399607716Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 17 13:43:56.402054 kubelet[2728]: I0117 13:43:56.399940 2728 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 17 13:43:57.296292 kubelet[2728]: I0117 13:43:57.295960 2728 topology_manager.go:215] "Topology Admit Handler" podUID="f3814ba0-20fc-4701-8acb-09867fc689a5" podNamespace="kube-system" podName="kube-proxy-qg854" Jan 17 13:43:57.318282 systemd[1]: Created slice kubepods-besteffort-podf3814ba0_20fc_4701_8acb_09867fc689a5.slice - libcontainer container kubepods-besteffort-podf3814ba0_20fc_4701_8acb_09867fc689a5.slice. 
Jan 17 13:43:57.324808 kubelet[2728]: I0117 13:43:57.324517 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f3814ba0-20fc-4701-8acb-09867fc689a5-lib-modules\") pod \"kube-proxy-qg854\" (UID: \"f3814ba0-20fc-4701-8acb-09867fc689a5\") " pod="kube-system/kube-proxy-qg854" Jan 17 13:43:57.324808 kubelet[2728]: I0117 13:43:57.324591 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqdsc\" (UniqueName: \"kubernetes.io/projected/f3814ba0-20fc-4701-8acb-09867fc689a5-kube-api-access-fqdsc\") pod \"kube-proxy-qg854\" (UID: \"f3814ba0-20fc-4701-8acb-09867fc689a5\") " pod="kube-system/kube-proxy-qg854" Jan 17 13:43:57.324808 kubelet[2728]: I0117 13:43:57.324635 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f3814ba0-20fc-4701-8acb-09867fc689a5-kube-proxy\") pod \"kube-proxy-qg854\" (UID: \"f3814ba0-20fc-4701-8acb-09867fc689a5\") " pod="kube-system/kube-proxy-qg854" Jan 17 13:43:57.324808 kubelet[2728]: I0117 13:43:57.324665 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f3814ba0-20fc-4701-8acb-09867fc689a5-xtables-lock\") pod \"kube-proxy-qg854\" (UID: \"f3814ba0-20fc-4701-8acb-09867fc689a5\") " pod="kube-system/kube-proxy-qg854" Jan 17 13:43:57.493318 kubelet[2728]: I0117 13:43:57.493193 2728 topology_manager.go:215] "Topology Admit Handler" podUID="113dfdf9-0e6a-4a24-abf3-4dfa8ee07d3f" podNamespace="tigera-operator" podName="tigera-operator-7bc55997bb-sm6cm" Jan 17 13:43:57.510175 systemd[1]: Created slice kubepods-besteffort-pod113dfdf9_0e6a_4a24_abf3_4dfa8ee07d3f.slice - libcontainer container kubepods-besteffort-pod113dfdf9_0e6a_4a24_abf3_4dfa8ee07d3f.slice. Jan 17 13:43:57.526306 kubelet[2728]: I0117 13:43:57.526232 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmb8r\" (UniqueName: \"kubernetes.io/projected/113dfdf9-0e6a-4a24-abf3-4dfa8ee07d3f-kube-api-access-tmb8r\") pod \"tigera-operator-7bc55997bb-sm6cm\" (UID: \"113dfdf9-0e6a-4a24-abf3-4dfa8ee07d3f\") " pod="tigera-operator/tigera-operator-7bc55997bb-sm6cm" Jan 17 13:43:57.526539 kubelet[2728]: I0117 13:43:57.526316 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/113dfdf9-0e6a-4a24-abf3-4dfa8ee07d3f-var-lib-calico\") pod \"tigera-operator-7bc55997bb-sm6cm\" (UID: \"113dfdf9-0e6a-4a24-abf3-4dfa8ee07d3f\") " pod="tigera-operator/tigera-operator-7bc55997bb-sm6cm" Jan 17 13:43:57.632996 containerd[1493]: time="2025-01-17T13:43:57.632892620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qg854,Uid:f3814ba0-20fc-4701-8acb-09867fc689a5,Namespace:kube-system,Attempt:0,}" Jan 17 13:43:57.695131 containerd[1493]: time="2025-01-17T13:43:57.694716159Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:43:57.695131 containerd[1493]: time="2025-01-17T13:43:57.694896474Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:43:57.695131 containerd[1493]: time="2025-01-17T13:43:57.694924333Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:43:57.696538 containerd[1493]: time="2025-01-17T13:43:57.695521115Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:43:57.737580 systemd[1]: Started cri-containerd-0d930d27429c5f40aba363773ddb877b3df87679d0a0358be42554992dd9477e.scope - libcontainer container 0d930d27429c5f40aba363773ddb877b3df87679d0a0358be42554992dd9477e. Jan 17 13:43:57.787255 containerd[1493]: time="2025-01-17T13:43:57.787130733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qg854,Uid:f3814ba0-20fc-4701-8acb-09867fc689a5,Namespace:kube-system,Attempt:0,} returns sandbox id \"0d930d27429c5f40aba363773ddb877b3df87679d0a0358be42554992dd9477e\"" Jan 17 13:43:57.792996 containerd[1493]: time="2025-01-17T13:43:57.792888720Z" level=info msg="CreateContainer within sandbox \"0d930d27429c5f40aba363773ddb877b3df87679d0a0358be42554992dd9477e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 17 13:43:57.815555 containerd[1493]: time="2025-01-17T13:43:57.815466351Z" level=info msg="CreateContainer within sandbox \"0d930d27429c5f40aba363773ddb877b3df87679d0a0358be42554992dd9477e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"253589b564e186bb1b89bfd0c236864d9581e5e48ef64f0db935294e371a4cc1\"" Jan 17 13:43:57.816957 containerd[1493]: time="2025-01-17T13:43:57.816905590Z" level=info msg="StartContainer for \"253589b564e186bb1b89bfd0c236864d9581e5e48ef64f0db935294e371a4cc1\"" Jan 17 13:43:57.819114 containerd[1493]: time="2025-01-17T13:43:57.819079867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-sm6cm,Uid:113dfdf9-0e6a-4a24-abf3-4dfa8ee07d3f,Namespace:tigera-operator,Attempt:0,}" Jan 17 13:43:57.871424 systemd[1]: Started cri-containerd-253589b564e186bb1b89bfd0c236864d9581e5e48ef64f0db935294e371a4cc1.scope - libcontainer container 253589b564e186bb1b89bfd0c236864d9581e5e48ef64f0db935294e371a4cc1. Jan 17 13:43:57.873962 containerd[1493]: time="2025-01-17T13:43:57.870606849Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:43:57.873962 containerd[1493]: time="2025-01-17T13:43:57.870711469Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:43:57.873962 containerd[1493]: time="2025-01-17T13:43:57.870729197Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:43:57.873962 containerd[1493]: time="2025-01-17T13:43:57.870873791Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:43:57.913679 systemd[1]: Started cri-containerd-d159e412f05a9341f0c94300fc046e565a8d4ecd064c352e3d46740bae535654.scope - libcontainer container d159e412f05a9341f0c94300fc046e565a8d4ecd064c352e3d46740bae535654. 
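
[Illustrative aside: the RunPodSandbox, CreateContainer, and StartContainer entries above for kube-proxy-qg854 are the standard three-step CRI bring-up. A minimal sketch of the same sequence against the runtime service; the pod metadata values are copied from the log, while the socket path and the kube-proxy image name are assumptions for illustration:]

    package main

    import (
        "context"
        "log"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        ctx, cancel := context.WithTimeout(context.Background(), time.Minute)
        defer cancel()
        conn, err := grpc.DialContext(ctx, "unix:///run/containerd/containerd.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()
        rt := runtimeapi.NewRuntimeServiceClient(conn)

        // 1. Sandbox, as in "RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qg854,...}".
        sandboxCfg := &runtimeapi.PodSandboxConfig{
            Metadata: &runtimeapi.PodSandboxMetadata{
                Name:      "kube-proxy-qg854",
                Uid:       "f3814ba0-20fc-4701-8acb-09867fc689a5",
                Namespace: "kube-system",
                Attempt:   0,
            },
        }
        sb, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{Config: sandboxCfg})
        if err != nil {
            log.Fatal(err)
        }

        // 2. Container inside the returned sandbox id ("CreateContainer within sandbox ...").
        cr, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
            PodSandboxId: sb.PodSandboxId,
            Config: &runtimeapi.ContainerConfig{
                Metadata: &runtimeapi.ContainerMetadata{Name: "kube-proxy", Attempt: 0},
                // Image name is a placeholder; the actual reference is not in this log.
                Image: &runtimeapi.ImageSpec{Image: "registry.k8s.io/kube-proxy:v1.30.0"},
            },
            SandboxConfig: sandboxCfg,
        })
        if err != nil {
            log.Fatal(err)
        }

        // 3. Start, matching the later "StartContainer ... returns successfully" entry.
        _, err = rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{ContainerId: cr.ContainerId})
        if err != nil {
            log.Fatal(err)
        }
    }
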
Jan 17 13:43:57.963253 containerd[1493]: time="2025-01-17T13:43:57.963063796Z" level=info msg="StartContainer for \"253589b564e186bb1b89bfd0c236864d9581e5e48ef64f0db935294e371a4cc1\" returns successfully" Jan 17 13:43:58.012959 containerd[1493]: time="2025-01-17T13:43:58.012735266Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-sm6cm,Uid:113dfdf9-0e6a-4a24-abf3-4dfa8ee07d3f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"d159e412f05a9341f0c94300fc046e565a8d4ecd064c352e3d46740bae535654\"" Jan 17 13:43:58.016664 containerd[1493]: time="2025-01-17T13:43:58.016581344Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Jan 17 13:43:58.384095 kubelet[2728]: I0117 13:43:58.383978 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-qg854" podStartSLOduration=1.383933782 podStartE2EDuration="1.383933782s" podCreationTimestamp="2025-01-17 13:43:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-17 13:43:58.381073811 +0000 UTC m=+16.368652410" watchObservedRunningTime="2025-01-17 13:43:58.383933782 +0000 UTC m=+16.371512346" Jan 17 13:44:00.200678 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2178723958.mount: Deactivated successfully. Jan 17 13:44:01.067041 containerd[1493]: time="2025-01-17T13:44:01.066949631Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:01.068620 containerd[1493]: time="2025-01-17T13:44:01.068567074Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21764289" Jan 17 13:44:01.070680 containerd[1493]: time="2025-01-17T13:44:01.070222116Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:01.074399 containerd[1493]: time="2025-01-17T13:44:01.074331360Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:01.076616 containerd[1493]: time="2025-01-17T13:44:01.076523764Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 3.059881241s" Jan 17 13:44:01.076711 containerd[1493]: time="2025-01-17T13:44:01.076622522Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Jan 17 13:44:01.080561 containerd[1493]: time="2025-01-17T13:44:01.080378531Z" level=info msg="CreateContainer within sandbox \"d159e412f05a9341f0c94300fc046e565a8d4ecd064c352e3d46740bae535654\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 17 13:44:01.116587 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount465829005.mount: Deactivated successfully. 
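
[Illustrative aside: the PullImage "quay.io/tigera/operator:v1.36.2" entry above is served by the CRI image service rather than the runtime service. A sketch of the equivalent client call (socket path again assumed); the runtime answers with the resolved reference, i.e. the sha256 digest that later appears in the "Pulled image ... repo digest" entry:]

    package main

    import (
        "context"
        "fmt"
        "log"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        ctx, cancel := context.WithTimeout(context.Background(), 5*time.Minute)
        defer cancel()
        conn, err := grpc.DialContext(ctx, "unix:///run/containerd/containerd.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        img := runtimeapi.NewImageServiceClient(conn)
        resp, err := img.PullImage(ctx, &runtimeapi.PullImageRequest{
            Image: &runtimeapi.ImageSpec{Image: "quay.io/tigera/operator:v1.36.2"},
        })
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(resp.ImageRef) // resolved image reference (digest)
    }
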
Jan 17 13:44:01.127863 containerd[1493]: time="2025-01-17T13:44:01.127821112Z" level=info msg="CreateContainer within sandbox \"d159e412f05a9341f0c94300fc046e565a8d4ecd064c352e3d46740bae535654\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"91905e02c8c579effc3bc4318331a3f74a52cb7cc8523f01d783b6bbce638d0c\"" Jan 17 13:44:01.129695 containerd[1493]: time="2025-01-17T13:44:01.129464552Z" level=info msg="StartContainer for \"91905e02c8c579effc3bc4318331a3f74a52cb7cc8523f01d783b6bbce638d0c\"" Jan 17 13:44:01.178421 systemd[1]: Started cri-containerd-91905e02c8c579effc3bc4318331a3f74a52cb7cc8523f01d783b6bbce638d0c.scope - libcontainer container 91905e02c8c579effc3bc4318331a3f74a52cb7cc8523f01d783b6bbce638d0c. Jan 17 13:44:01.224977 containerd[1493]: time="2025-01-17T13:44:01.224858461Z" level=info msg="StartContainer for \"91905e02c8c579effc3bc4318331a3f74a52cb7cc8523f01d783b6bbce638d0c\" returns successfully" Jan 17 13:44:04.604361 kubelet[2728]: I0117 13:44:04.601739 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7bc55997bb-sm6cm" podStartSLOduration=4.538845455 podStartE2EDuration="7.601690548s" podCreationTimestamp="2025-01-17 13:43:57 +0000 UTC" firstStartedPulling="2025-01-17 13:43:58.01512187 +0000 UTC m=+16.002700422" lastFinishedPulling="2025-01-17 13:44:01.077966951 +0000 UTC m=+19.065545515" observedRunningTime="2025-01-17 13:44:01.365121927 +0000 UTC m=+19.352700497" watchObservedRunningTime="2025-01-17 13:44:04.601690548 +0000 UTC m=+22.589269114" Jan 17 13:44:04.607593 kubelet[2728]: I0117 13:44:04.604658 2728 topology_manager.go:215] "Topology Admit Handler" podUID="0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e" podNamespace="calico-system" podName="calico-typha-549dc69985-qcz2l" Jan 17 13:44:04.624292 systemd[1]: Created slice kubepods-besteffort-pod0f6a0aee_24fc_4d99_8ffc_cd83e4c6c39e.slice - libcontainer container kubepods-besteffort-pod0f6a0aee_24fc_4d99_8ffc_cd83e4c6c39e.slice. 
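
[Illustrative aside: the startup-latency numbers recorded above are internally consistent. For tigera-operator, podStartE2EDuration minus the image-pull window (firstStartedPulling to lastFinishedPulling) gives podStartSLOduration, i.e. the SLO clock excludes pull time. A quick check using the exact timestamps from the log; the residual few-nanosecond discrepancy versus the reported 4.538845455s is just the tracker taking its clock readings at slightly different instants:]

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Timestamps copied verbatim from the pod_startup_latency_tracker entry.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        started, err := time.Parse(layout, "2025-01-17 13:43:58.01512187 +0000 UTC")
        if err != nil {
            panic(err)
        }
        finished, err := time.Parse(layout, "2025-01-17 13:44:01.077966951 +0000 UTC")
        if err != nil {
            panic(err)
        }

        pull := finished.Sub(started)       // image-pull window: ~3.062845081s
        e2e := 7601690548 * time.Nanosecond // podStartE2EDuration from the log
        fmt.Println(pull, e2e-pull)         // e2e-pull ≈ 4.538845467s ≈ podStartSLOduration
    }
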
Jan 17 13:44:04.629306 kubelet[2728]: W0117 13:44:04.628888 2728 reflector.go:547] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:srv-so9hk.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'srv-so9hk.gb1.brightbox.com' and this object Jan 17 13:44:04.629306 kubelet[2728]: E0117 13:44:04.628990 2728 reflector.go:150] object-"calico-system"/"tigera-ca-bundle": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:srv-so9hk.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'srv-so9hk.gb1.brightbox.com' and this object Jan 17 13:44:04.630224 kubelet[2728]: W0117 13:44:04.629791 2728 reflector.go:547] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:srv-so9hk.gb1.brightbox.com" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'srv-so9hk.gb1.brightbox.com' and this object Jan 17 13:44:04.630224 kubelet[2728]: E0117 13:44:04.629852 2728 reflector.go:150] object-"calico-system"/"typha-certs": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:srv-so9hk.gb1.brightbox.com" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'srv-so9hk.gb1.brightbox.com' and this object Jan 17 13:44:04.632113 kubelet[2728]: W0117 13:44:04.630634 2728 reflector.go:547] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:srv-so9hk.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'srv-so9hk.gb1.brightbox.com' and this object Jan 17 13:44:04.632113 kubelet[2728]: E0117 13:44:04.630671 2728 reflector.go:150] object-"calico-system"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:srv-so9hk.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'srv-so9hk.gb1.brightbox.com' and this object Jan 17 13:44:04.674781 kubelet[2728]: I0117 13:44:04.674715 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmqdk\" (UniqueName: \"kubernetes.io/projected/0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e-kube-api-access-zmqdk\") pod \"calico-typha-549dc69985-qcz2l\" (UID: \"0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e\") " pod="calico-system/calico-typha-549dc69985-qcz2l" Jan 17 13:44:04.675027 kubelet[2728]: I0117 13:44:04.674802 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e-tigera-ca-bundle\") pod \"calico-typha-549dc69985-qcz2l\" (UID: \"0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e\") " pod="calico-system/calico-typha-549dc69985-qcz2l" Jan 17 13:44:04.675027 kubelet[2728]: I0117 13:44:04.674845 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: 
\"kubernetes.io/secret/0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e-typha-certs\") pod \"calico-typha-549dc69985-qcz2l\" (UID: \"0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e\") " pod="calico-system/calico-typha-549dc69985-qcz2l" Jan 17 13:44:04.747655 kubelet[2728]: I0117 13:44:04.747569 2728 topology_manager.go:215] "Topology Admit Handler" podUID="1b3859c4-a241-4cf9-8c29-3c1172341767" podNamespace="calico-system" podName="calico-node-c7vcc" Jan 17 13:44:04.762687 systemd[1]: Created slice kubepods-besteffort-pod1b3859c4_a241_4cf9_8c29_3c1172341767.slice - libcontainer container kubepods-besteffort-pod1b3859c4_a241_4cf9_8c29_3c1172341767.slice. Jan 17 13:44:04.775295 kubelet[2728]: I0117 13:44:04.775247 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/1b3859c4-a241-4cf9-8c29-3c1172341767-cni-log-dir\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.775434 kubelet[2728]: I0117 13:44:04.775303 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b3859c4-a241-4cf9-8c29-3c1172341767-tigera-ca-bundle\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.775434 kubelet[2728]: I0117 13:44:04.775347 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/1b3859c4-a241-4cf9-8c29-3c1172341767-node-certs\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.775434 kubelet[2728]: I0117 13:44:04.775382 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/1b3859c4-a241-4cf9-8c29-3c1172341767-var-run-calico\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.775434 kubelet[2728]: I0117 13:44:04.775412 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/1b3859c4-a241-4cf9-8c29-3c1172341767-cni-bin-dir\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.775864 kubelet[2728]: I0117 13:44:04.775460 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/1b3859c4-a241-4cf9-8c29-3c1172341767-cni-net-dir\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.775864 kubelet[2728]: I0117 13:44:04.775501 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1b3859c4-a241-4cf9-8c29-3c1172341767-xtables-lock\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.775864 kubelet[2728]: I0117 13:44:04.775545 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: 
\"kubernetes.io/host-path/1b3859c4-a241-4cf9-8c29-3c1172341767-policysync\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.775864 kubelet[2728]: I0117 13:44:04.775570 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/1b3859c4-a241-4cf9-8c29-3c1172341767-flexvol-driver-host\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.775864 kubelet[2728]: I0117 13:44:04.775626 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b3859c4-a241-4cf9-8c29-3c1172341767-lib-modules\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.776130 kubelet[2728]: I0117 13:44:04.775698 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnj5s\" (UniqueName: \"kubernetes.io/projected/1b3859c4-a241-4cf9-8c29-3c1172341767-kube-api-access-rnj5s\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.776130 kubelet[2728]: I0117 13:44:04.775739 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1b3859c4-a241-4cf9-8c29-3c1172341767-var-lib-calico\") pod \"calico-node-c7vcc\" (UID: \"1b3859c4-a241-4cf9-8c29-3c1172341767\") " pod="calico-system/calico-node-c7vcc" Jan 17 13:44:04.888710 kubelet[2728]: E0117 13:44:04.888216 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:04.888710 kubelet[2728]: W0117 13:44:04.888265 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:04.888710 kubelet[2728]: E0117 13:44:04.888325 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:04.978338 kubelet[2728]: E0117 13:44:04.978083 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:04.978338 kubelet[2728]: W0117 13:44:04.978124 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:04.978338 kubelet[2728]: E0117 13:44:04.978160 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:04.980258 kubelet[2728]: E0117 13:44:04.980206 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:04.980258 kubelet[2728]: W0117 13:44:04.980242 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:04.980258 kubelet[2728]: E0117 13:44:04.980261 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:04.980745 kubelet[2728]: E0117 13:44:04.980709 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:04.980745 kubelet[2728]: W0117 13:44:04.980736 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:04.980880 kubelet[2728]: E0117 13:44:04.980772 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:04.981307 kubelet[2728]: E0117 13:44:04.981175 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:04.981307 kubelet[2728]: W0117 13:44:04.981232 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:04.981307 kubelet[2728]: E0117 13:44:04.981250 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:04.981806 kubelet[2728]: E0117 13:44:04.981782 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:04.981806 kubelet[2728]: W0117 13:44:04.981804 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:04.981938 kubelet[2728]: E0117 13:44:04.981849 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.001221 kubelet[2728]: I0117 13:44:05.001111 2728 topology_manager.go:215] "Topology Admit Handler" podUID="41e59e9c-f5c4-48af-a614-7a43cf86d00d" podNamespace="calico-system" podName="csi-node-driver-9fwmg" Jan 17 13:44:05.001690 kubelet[2728]: E0117 13:44:05.001654 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9fwmg" podUID="41e59e9c-f5c4-48af-a614-7a43cf86d00d" Jan 17 13:44:05.081430 kubelet[2728]: E0117 13:44:05.081351 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.081430 kubelet[2728]: W0117 13:44:05.081425 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.081800 kubelet[2728]: E0117 13:44:05.081519 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.082421 kubelet[2728]: E0117 13:44:05.082085 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.082421 kubelet[2728]: W0117 13:44:05.082109 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.082421 kubelet[2728]: E0117 13:44:05.082144 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.082609 kubelet[2728]: E0117 13:44:05.082469 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.082609 kubelet[2728]: W0117 13:44:05.082485 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.082609 kubelet[2728]: E0117 13:44:05.082501 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.082948 kubelet[2728]: E0117 13:44:05.082830 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.082948 kubelet[2728]: W0117 13:44:05.082846 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.082948 kubelet[2728]: E0117 13:44:05.082862 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.083899 kubelet[2728]: E0117 13:44:05.083823 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.083899 kubelet[2728]: W0117 13:44:05.083859 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.084072 kubelet[2728]: E0117 13:44:05.083905 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.084157 kubelet[2728]: I0117 13:44:05.084125 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/41e59e9c-f5c4-48af-a614-7a43cf86d00d-varrun\") pod \"csi-node-driver-9fwmg\" (UID: \"41e59e9c-f5c4-48af-a614-7a43cf86d00d\") " pod="calico-system/csi-node-driver-9fwmg" Jan 17 13:44:05.084738 kubelet[2728]: E0117 13:44:05.084710 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.084738 kubelet[2728]: W0117 13:44:05.084733 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.084886 kubelet[2728]: E0117 13:44:05.084793 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.086110 kubelet[2728]: E0117 13:44:05.086067 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.086110 kubelet[2728]: W0117 13:44:05.086090 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.086930 kubelet[2728]: E0117 13:44:05.086290 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.087017 kubelet[2728]: E0117 13:44:05.086963 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.087017 kubelet[2728]: W0117 13:44:05.086992 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.088275 kubelet[2728]: E0117 13:44:05.088243 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.088515 kubelet[2728]: E0117 13:44:05.088489 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.088515 kubelet[2728]: W0117 13:44:05.088511 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.088653 kubelet[2728]: E0117 13:44:05.088607 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.089220 kubelet[2728]: E0117 13:44:05.088887 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.089220 kubelet[2728]: W0117 13:44:05.088909 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.089220 kubelet[2728]: E0117 13:44:05.089016 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.089875 kubelet[2728]: E0117 13:44:05.089839 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.089978 kubelet[2728]: W0117 13:44:05.089877 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.089978 kubelet[2728]: E0117 13:44:05.089929 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.090550 kubelet[2728]: E0117 13:44:05.090519 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.090550 kubelet[2728]: W0117 13:44:05.090543 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.090684 kubelet[2728]: E0117 13:44:05.090592 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.091503 kubelet[2728]: E0117 13:44:05.091476 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.091503 kubelet[2728]: W0117 13:44:05.091499 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.091647 kubelet[2728]: E0117 13:44:05.091536 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.092861 kubelet[2728]: E0117 13:44:05.092309 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.092861 kubelet[2728]: W0117 13:44:05.092332 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.092861 kubelet[2728]: E0117 13:44:05.092365 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.092861 kubelet[2728]: E0117 13:44:05.092668 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.092861 kubelet[2728]: W0117 13:44:05.092682 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.092861 kubelet[2728]: E0117 13:44:05.092704 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.093488 kubelet[2728]: E0117 13:44:05.093460 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.093488 kubelet[2728]: W0117 13:44:05.093483 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.093627 kubelet[2728]: E0117 13:44:05.093510 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.094406 kubelet[2728]: E0117 13:44:05.094332 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.094406 kubelet[2728]: W0117 13:44:05.094365 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.094660 kubelet[2728]: E0117 13:44:05.094568 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.095278 kubelet[2728]: E0117 13:44:05.095131 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.095278 kubelet[2728]: W0117 13:44:05.095161 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.095278 kubelet[2728]: E0117 13:44:05.095228 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.096299 kubelet[2728]: E0117 13:44:05.096098 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.096299 kubelet[2728]: W0117 13:44:05.096120 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.096299 kubelet[2728]: E0117 13:44:05.096137 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.096886 kubelet[2728]: E0117 13:44:05.096647 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.096886 kubelet[2728]: W0117 13:44:05.096672 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.096886 kubelet[2728]: E0117 13:44:05.096690 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.097643 kubelet[2728]: E0117 13:44:05.097346 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.097643 kubelet[2728]: W0117 13:44:05.097388 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.097643 kubelet[2728]: E0117 13:44:05.097407 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.098334 kubelet[2728]: E0117 13:44:05.098302 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.098334 kubelet[2728]: W0117 13:44:05.098325 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.098462 kubelet[2728]: E0117 13:44:05.098343 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.099468 kubelet[2728]: E0117 13:44:05.098901 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.099468 kubelet[2728]: W0117 13:44:05.098923 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.099468 kubelet[2728]: E0117 13:44:05.098940 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.100136 kubelet[2728]: E0117 13:44:05.100066 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.100136 kubelet[2728]: W0117 13:44:05.100088 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.100136 kubelet[2728]: E0117 13:44:05.100105 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.100766 kubelet[2728]: E0117 13:44:05.100488 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.100766 kubelet[2728]: W0117 13:44:05.100505 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.100766 kubelet[2728]: E0117 13:44:05.100521 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.100902 kubelet[2728]: E0117 13:44:05.100891 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.100976 kubelet[2728]: W0117 13:44:05.100906 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.100976 kubelet[2728]: E0117 13:44:05.100921 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.102293 kubelet[2728]: E0117 13:44:05.102265 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.102293 kubelet[2728]: W0117 13:44:05.102288 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.102442 kubelet[2728]: E0117 13:44:05.102307 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.103333 kubelet[2728]: E0117 13:44:05.103283 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.103333 kubelet[2728]: W0117 13:44:05.103324 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.104273 kubelet[2728]: E0117 13:44:05.103343 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.206282 kubelet[2728]: E0117 13:44:05.205622 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.206527 kubelet[2728]: W0117 13:44:05.206386 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.206527 kubelet[2728]: E0117 13:44:05.206433 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.209444 kubelet[2728]: E0117 13:44:05.206793 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.209444 kubelet[2728]: W0117 13:44:05.206816 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.209444 kubelet[2728]: E0117 13:44:05.206833 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.209444 kubelet[2728]: E0117 13:44:05.207101 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.209444 kubelet[2728]: W0117 13:44:05.207115 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.209444 kubelet[2728]: E0117 13:44:05.207141 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.209444 kubelet[2728]: I0117 13:44:05.207225 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41e59e9c-f5c4-48af-a614-7a43cf86d00d-kubelet-dir\") pod \"csi-node-driver-9fwmg\" (UID: \"41e59e9c-f5c4-48af-a614-7a43cf86d00d\") " pod="calico-system/csi-node-driver-9fwmg" Jan 17 13:44:05.209444 kubelet[2728]: E0117 13:44:05.207504 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.209444 kubelet[2728]: W0117 13:44:05.207530 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.210010 kubelet[2728]: E0117 13:44:05.207546 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.210010 kubelet[2728]: I0117 13:44:05.207571 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/41e59e9c-f5c4-48af-a614-7a43cf86d00d-registration-dir\") pod \"csi-node-driver-9fwmg\" (UID: \"41e59e9c-f5c4-48af-a614-7a43cf86d00d\") " pod="calico-system/csi-node-driver-9fwmg" Jan 17 13:44:05.210010 kubelet[2728]: E0117 13:44:05.207866 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.210010 kubelet[2728]: W0117 13:44:05.207882 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.210010 kubelet[2728]: E0117 13:44:05.207897 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.210010 kubelet[2728]: E0117 13:44:05.208223 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.210010 kubelet[2728]: W0117 13:44:05.208252 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.210010 kubelet[2728]: E0117 13:44:05.208270 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.210459 kubelet[2728]: I0117 13:44:05.208308 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/41e59e9c-f5c4-48af-a614-7a43cf86d00d-socket-dir\") pod \"csi-node-driver-9fwmg\" (UID: \"41e59e9c-f5c4-48af-a614-7a43cf86d00d\") " pod="calico-system/csi-node-driver-9fwmg" Jan 17 13:44:05.210459 kubelet[2728]: E0117 13:44:05.208584 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.210459 kubelet[2728]: W0117 13:44:05.208602 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.210459 kubelet[2728]: E0117 13:44:05.208618 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.210459 kubelet[2728]: E0117 13:44:05.208885 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.210459 kubelet[2728]: W0117 13:44:05.208913 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.210459 kubelet[2728]: E0117 13:44:05.208928 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.210459 kubelet[2728]: E0117 13:44:05.209230 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.210459 kubelet[2728]: W0117 13:44:05.209245 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.210915 kubelet[2728]: E0117 13:44:05.209262 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.210915 kubelet[2728]: E0117 13:44:05.209535 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.210915 kubelet[2728]: W0117 13:44:05.209551 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.210915 kubelet[2728]: E0117 13:44:05.209567 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.210915 kubelet[2728]: I0117 13:44:05.209600 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc598\" (UniqueName: \"kubernetes.io/projected/41e59e9c-f5c4-48af-a614-7a43cf86d00d-kube-api-access-lc598\") pod \"csi-node-driver-9fwmg\" (UID: \"41e59e9c-f5c4-48af-a614-7a43cf86d00d\") " pod="calico-system/csi-node-driver-9fwmg" Jan 17 13:44:05.210915 kubelet[2728]: E0117 13:44:05.209910 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.210915 kubelet[2728]: W0117 13:44:05.209926 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.210915 kubelet[2728]: E0117 13:44:05.209942 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.210915 kubelet[2728]: E0117 13:44:05.210242 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.215639 kubelet[2728]: W0117 13:44:05.210258 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.215639 kubelet[2728]: E0117 13:44:05.210273 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.215639 kubelet[2728]: E0117 13:44:05.210562 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.215639 kubelet[2728]: W0117 13:44:05.210578 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.215639 kubelet[2728]: E0117 13:44:05.210602 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.215639 kubelet[2728]: E0117 13:44:05.210852 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.215639 kubelet[2728]: W0117 13:44:05.210867 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.215639 kubelet[2728]: E0117 13:44:05.210882 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.215639 kubelet[2728]: E0117 13:44:05.211141 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.215639 kubelet[2728]: W0117 13:44:05.211155 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.216421 kubelet[2728]: E0117 13:44:05.211225 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.216421 kubelet[2728]: E0117 13:44:05.211479 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.216421 kubelet[2728]: W0117 13:44:05.211507 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.216421 kubelet[2728]: E0117 13:44:05.211526 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.216421 kubelet[2728]: E0117 13:44:05.211785 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.216421 kubelet[2728]: W0117 13:44:05.211799 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.216421 kubelet[2728]: E0117 13:44:05.211815 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.216421 kubelet[2728]: E0117 13:44:05.212717 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.216421 kubelet[2728]: W0117 13:44:05.212737 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.216421 kubelet[2728]: E0117 13:44:05.212753 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.216989 kubelet[2728]: E0117 13:44:05.213019 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.216989 kubelet[2728]: W0117 13:44:05.213034 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.216989 kubelet[2728]: E0117 13:44:05.213069 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.216989 kubelet[2728]: E0117 13:44:05.213482 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.216989 kubelet[2728]: W0117 13:44:05.213497 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.216989 kubelet[2728]: E0117 13:44:05.213514 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.216989 kubelet[2728]: E0117 13:44:05.213835 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.216989 kubelet[2728]: W0117 13:44:05.213850 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.216989 kubelet[2728]: E0117 13:44:05.213865 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.216989 kubelet[2728]: E0117 13:44:05.214109 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.217573 kubelet[2728]: W0117 13:44:05.214123 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.217573 kubelet[2728]: E0117 13:44:05.214139 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.311515 kubelet[2728]: E0117 13:44:05.311464 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.312389 kubelet[2728]: W0117 13:44:05.311753 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.312389 kubelet[2728]: E0117 13:44:05.311805 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.313023 kubelet[2728]: E0117 13:44:05.312864 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.313023 kubelet[2728]: W0117 13:44:05.312885 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.313023 kubelet[2728]: E0117 13:44:05.312914 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.315709 kubelet[2728]: E0117 13:44:05.314561 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.315709 kubelet[2728]: W0117 13:44:05.314584 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.315709 kubelet[2728]: E0117 13:44:05.314602 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.315709 kubelet[2728]: E0117 13:44:05.315369 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.315709 kubelet[2728]: W0117 13:44:05.315386 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.315709 kubelet[2728]: E0117 13:44:05.315403 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.316326 kubelet[2728]: E0117 13:44:05.316298 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.316326 kubelet[2728]: W0117 13:44:05.316321 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.316937 kubelet[2728]: E0117 13:44:05.316347 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.317114 kubelet[2728]: E0117 13:44:05.317097 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.317469 kubelet[2728]: W0117 13:44:05.317114 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.317469 kubelet[2728]: E0117 13:44:05.317170 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.317939 kubelet[2728]: E0117 13:44:05.317516 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.317939 kubelet[2728]: W0117 13:44:05.317532 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.317939 kubelet[2728]: E0117 13:44:05.317806 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.317939 kubelet[2728]: E0117 13:44:05.317872 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.317939 kubelet[2728]: W0117 13:44:05.317892 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.318880 kubelet[2728]: E0117 13:44:05.318169 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.318880 kubelet[2728]: W0117 13:44:05.318218 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.318880 kubelet[2728]: E0117 13:44:05.318236 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.318880 kubelet[2728]: E0117 13:44:05.318326 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.319597 kubelet[2728]: E0117 13:44:05.319278 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.319597 kubelet[2728]: W0117 13:44:05.319301 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.319597 kubelet[2728]: E0117 13:44:05.319381 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.320081 kubelet[2728]: E0117 13:44:05.320043 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.320296 kubelet[2728]: W0117 13:44:05.320215 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.320646 kubelet[2728]: E0117 13:44:05.320434 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.320979 kubelet[2728]: E0117 13:44:05.320943 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.321239 kubelet[2728]: W0117 13:44:05.321079 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.322405 kubelet[2728]: E0117 13:44:05.321123 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.322627 kubelet[2728]: E0117 13:44:05.322602 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.322627 kubelet[2728]: W0117 13:44:05.322627 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.322754 kubelet[2728]: E0117 13:44:05.322647 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.323493 kubelet[2728]: E0117 13:44:05.323464 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.324285 kubelet[2728]: W0117 13:44:05.324255 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.324285 kubelet[2728]: E0117 13:44:05.324288 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.325967 kubelet[2728]: E0117 13:44:05.325941 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.325967 kubelet[2728]: W0117 13:44:05.325967 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.326237 kubelet[2728]: E0117 13:44:05.325985 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.327046 kubelet[2728]: E0117 13:44:05.327016 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.327046 kubelet[2728]: W0117 13:44:05.327042 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.327264 kubelet[2728]: E0117 13:44:05.327062 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.328998 kubelet[2728]: E0117 13:44:05.328968 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.328998 kubelet[2728]: W0117 13:44:05.328993 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.329138 kubelet[2728]: E0117 13:44:05.329025 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.330020 kubelet[2728]: E0117 13:44:05.329994 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.330020 kubelet[2728]: W0117 13:44:05.330018 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.330134 kubelet[2728]: E0117 13:44:05.330037 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.330735 kubelet[2728]: E0117 13:44:05.330569 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.330735 kubelet[2728]: W0117 13:44:05.330593 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.330735 kubelet[2728]: E0117 13:44:05.330611 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.332422 kubelet[2728]: E0117 13:44:05.332396 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.332422 kubelet[2728]: W0117 13:44:05.332421 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.332665 kubelet[2728]: E0117 13:44:05.332447 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.333466 kubelet[2728]: E0117 13:44:05.333233 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.333466 kubelet[2728]: W0117 13:44:05.333258 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.333466 kubelet[2728]: E0117 13:44:05.333276 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.334388 kubelet[2728]: E0117 13:44:05.334104 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.334388 kubelet[2728]: W0117 13:44:05.334127 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.334388 kubelet[2728]: E0117 13:44:05.334145 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.335215 kubelet[2728]: E0117 13:44:05.334791 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.335215 kubelet[2728]: W0117 13:44:05.334812 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.335215 kubelet[2728]: E0117 13:44:05.334830 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.335465 kubelet[2728]: E0117 13:44:05.335440 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.335465 kubelet[2728]: W0117 13:44:05.335462 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.335571 kubelet[2728]: E0117 13:44:05.335481 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.336560 kubelet[2728]: E0117 13:44:05.336511 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.336560 kubelet[2728]: W0117 13:44:05.336540 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.336697 kubelet[2728]: E0117 13:44:05.336558 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.434462 kubelet[2728]: E0117 13:44:05.434231 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.434462 kubelet[2728]: W0117 13:44:05.434267 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.434462 kubelet[2728]: E0117 13:44:05.434298 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.435262 kubelet[2728]: E0117 13:44:05.435005 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.435262 kubelet[2728]: W0117 13:44:05.435031 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.435262 kubelet[2728]: E0117 13:44:05.435048 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.435651 kubelet[2728]: E0117 13:44:05.435618 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.435772 kubelet[2728]: W0117 13:44:05.435751 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.436377 kubelet[2728]: E0117 13:44:05.435883 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.436906 kubelet[2728]: E0117 13:44:05.436711 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.436906 kubelet[2728]: W0117 13:44:05.436730 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.436906 kubelet[2728]: E0117 13:44:05.436747 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.437161 kubelet[2728]: E0117 13:44:05.437141 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.437293 kubelet[2728]: W0117 13:44:05.437273 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.437626 kubelet[2728]: E0117 13:44:05.437399 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.437919 kubelet[2728]: E0117 13:44:05.437813 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.437919 kubelet[2728]: W0117 13:44:05.437841 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.437919 kubelet[2728]: E0117 13:44:05.437858 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.540621 kubelet[2728]: E0117 13:44:05.540448 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.540621 kubelet[2728]: W0117 13:44:05.540485 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.540621 kubelet[2728]: E0117 13:44:05.540543 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.542459 kubelet[2728]: E0117 13:44:05.542419 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.542459 kubelet[2728]: W0117 13:44:05.542446 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.542607 kubelet[2728]: E0117 13:44:05.542465 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.542918 kubelet[2728]: E0117 13:44:05.542725 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.542918 kubelet[2728]: W0117 13:44:05.542747 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.542918 kubelet[2728]: E0117 13:44:05.542763 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.544869 kubelet[2728]: E0117 13:44:05.544690 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.544869 kubelet[2728]: W0117 13:44:05.544712 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.544869 kubelet[2728]: E0117 13:44:05.544730 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.545595 kubelet[2728]: E0117 13:44:05.545567 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.545595 kubelet[2728]: W0117 13:44:05.545589 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.545756 kubelet[2728]: E0117 13:44:05.545607 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.547176 kubelet[2728]: E0117 13:44:05.546671 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.547176 kubelet[2728]: W0117 13:44:05.546694 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.547176 kubelet[2728]: E0117 13:44:05.546711 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.647631 kubelet[2728]: E0117 13:44:05.647539 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.647631 kubelet[2728]: W0117 13:44:05.647624 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.648415 kubelet[2728]: E0117 13:44:05.647680 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.648415 kubelet[2728]: E0117 13:44:05.648022 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.648415 kubelet[2728]: W0117 13:44:05.648037 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.648415 kubelet[2728]: E0117 13:44:05.648052 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.648616 kubelet[2728]: E0117 13:44:05.648477 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.648616 kubelet[2728]: W0117 13:44:05.648492 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.648616 kubelet[2728]: E0117 13:44:05.648508 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.649594 kubelet[2728]: E0117 13:44:05.649027 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.649594 kubelet[2728]: W0117 13:44:05.649048 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.649594 kubelet[2728]: E0117 13:44:05.649071 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.649594 kubelet[2728]: E0117 13:44:05.649540 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.649594 kubelet[2728]: W0117 13:44:05.649554 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.650206 kubelet[2728]: E0117 13:44:05.650131 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.650560 kubelet[2728]: E0117 13:44:05.650534 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.650560 kubelet[2728]: W0117 13:44:05.650556 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.650674 kubelet[2728]: E0117 13:44:05.650572 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.752597 kubelet[2728]: E0117 13:44:05.752527 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.752597 kubelet[2728]: W0117 13:44:05.752577 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.752952 kubelet[2728]: E0117 13:44:05.752622 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.753042 kubelet[2728]: E0117 13:44:05.753019 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.753042 kubelet[2728]: W0117 13:44:05.753041 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.753229 kubelet[2728]: E0117 13:44:05.753070 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.754116 kubelet[2728]: E0117 13:44:05.753665 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.754116 kubelet[2728]: W0117 13:44:05.753687 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.754116 kubelet[2728]: E0117 13:44:05.753713 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.754454 kubelet[2728]: E0117 13:44:05.754277 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.754454 kubelet[2728]: W0117 13:44:05.754292 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.754454 kubelet[2728]: E0117 13:44:05.754323 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.754713 kubelet[2728]: E0117 13:44:05.754625 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.754713 kubelet[2728]: W0117 13:44:05.754648 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.754713 kubelet[2728]: E0117 13:44:05.754664 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.755293 kubelet[2728]: E0117 13:44:05.755260 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.755293 kubelet[2728]: W0117 13:44:05.755286 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.755451 kubelet[2728]: E0117 13:44:05.755304 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.777152 kubelet[2728]: E0117 13:44:05.777049 2728 configmap.go:199] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Jan 17 13:44:05.777404 kubelet[2728]: E0117 13:44:05.777314 2728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e-tigera-ca-bundle podName:0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e nodeName:}" failed. No retries permitted until 2025-01-17 13:44:06.277253793 +0000 UTC m=+24.264832344 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e-tigera-ca-bundle") pod "calico-typha-549dc69985-qcz2l" (UID: "0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e") : failed to sync configmap cache: timed out waiting for the condition Jan 17 13:44:05.790848 kubelet[2728]: E0117 13:44:05.790669 2728 secret.go:194] Couldn't get secret calico-system/typha-certs: failed to sync secret cache: timed out waiting for the condition Jan 17 13:44:05.790848 kubelet[2728]: E0117 13:44:05.790772 2728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e-typha-certs podName:0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e nodeName:}" failed. No retries permitted until 2025-01-17 13:44:06.290751932 +0000 UTC m=+24.278330483 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "typha-certs" (UniqueName: "kubernetes.io/secret/0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e-typha-certs") pod "calico-typha-549dc69985-qcz2l" (UID: "0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e") : failed to sync secret cache: timed out waiting for the condition Jan 17 13:44:05.804208 kubelet[2728]: E0117 13:44:05.801760 2728 projected.go:294] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 17 13:44:05.804208 kubelet[2728]: E0117 13:44:05.802261 2728 projected.go:200] Error preparing data for projected volume kube-api-access-zmqdk for pod calico-system/calico-typha-549dc69985-qcz2l: failed to sync configmap cache: timed out waiting for the condition Jan 17 13:44:05.804208 kubelet[2728]: E0117 13:44:05.802360 2728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e-kube-api-access-zmqdk podName:0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e nodeName:}" failed. No retries permitted until 2025-01-17 13:44:06.302341383 +0000 UTC m=+24.289919935 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zmqdk" (UniqueName: "kubernetes.io/projected/0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e-kube-api-access-zmqdk") pod "calico-typha-549dc69985-qcz2l" (UID: "0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e") : failed to sync configmap cache: timed out waiting for the condition Jan 17 13:44:05.855241 kubelet[2728]: E0117 13:44:05.855133 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.858671 kubelet[2728]: W0117 13:44:05.857472 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.858671 kubelet[2728]: E0117 13:44:05.857540 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.860142 kubelet[2728]: E0117 13:44:05.859492 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.860142 kubelet[2728]: W0117 13:44:05.859517 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.860346 kubelet[2728]: E0117 13:44:05.860252 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.862124 kubelet[2728]: E0117 13:44:05.861330 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.862124 kubelet[2728]: W0117 13:44:05.861367 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.862124 kubelet[2728]: E0117 13:44:05.861393 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.862124 kubelet[2728]: E0117 13:44:05.861913 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.862124 kubelet[2728]: W0117 13:44:05.861929 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.862124 kubelet[2728]: E0117 13:44:05.861945 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.865213 kubelet[2728]: E0117 13:44:05.864679 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.865213 kubelet[2728]: W0117 13:44:05.865002 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.865213 kubelet[2728]: E0117 13:44:05.865028 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.865834 kubelet[2728]: E0117 13:44:05.865676 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.865834 kubelet[2728]: W0117 13:44:05.865716 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.865834 kubelet[2728]: E0117 13:44:05.865737 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.878284 kubelet[2728]: E0117 13:44:05.878172 2728 configmap.go:199] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Jan 17 13:44:05.878793 kubelet[2728]: E0117 13:44:05.878766 2728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1b3859c4-a241-4cf9-8c29-3c1172341767-tigera-ca-bundle podName:1b3859c4-a241-4cf9-8c29-3c1172341767 nodeName:}" failed. No retries permitted until 2025-01-17 13:44:06.378405323 +0000 UTC m=+24.365983881 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/1b3859c4-a241-4cf9-8c29-3c1172341767-tigera-ca-bundle") pod "calico-node-c7vcc" (UID: "1b3859c4-a241-4cf9-8c29-3c1172341767") : failed to sync configmap cache: timed out waiting for the condition Jan 17 13:44:05.968434 kubelet[2728]: E0117 13:44:05.968276 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.968434 kubelet[2728]: W0117 13:44:05.968319 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.968434 kubelet[2728]: E0117 13:44:05.968356 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.970215 kubelet[2728]: E0117 13:44:05.969960 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.970215 kubelet[2728]: W0117 13:44:05.969983 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.970215 kubelet[2728]: E0117 13:44:05.970000 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:05.970799 kubelet[2728]: E0117 13:44:05.970727 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.970799 kubelet[2728]: W0117 13:44:05.970797 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.970934 kubelet[2728]: E0117 13:44:05.970816 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:05.971318 kubelet[2728]: E0117 13:44:05.971290 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:05.971409 kubelet[2728]: W0117 13:44:05.971331 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:05.971409 kubelet[2728]: E0117 13:44:05.971351 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.073400 kubelet[2728]: E0117 13:44:06.073268 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.074745 kubelet[2728]: W0117 13:44:06.074013 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.074745 kubelet[2728]: E0117 13:44:06.074060 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.077756 kubelet[2728]: E0117 13:44:06.077404 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.077756 kubelet[2728]: W0117 13:44:06.077437 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.077756 kubelet[2728]: E0117 13:44:06.077467 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.078676 kubelet[2728]: E0117 13:44:06.078455 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.078676 kubelet[2728]: W0117 13:44:06.078494 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.078676 kubelet[2728]: E0117 13:44:06.078512 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.079301 kubelet[2728]: E0117 13:44:06.079160 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.079301 kubelet[2728]: W0117 13:44:06.079207 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.079301 kubelet[2728]: E0117 13:44:06.079227 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:06.180313 kubelet[2728]: E0117 13:44:06.180224 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.180313 kubelet[2728]: W0117 13:44:06.180262 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.180313 kubelet[2728]: E0117 13:44:06.180293 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.180845 kubelet[2728]: E0117 13:44:06.180690 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.180845 kubelet[2728]: W0117 13:44:06.180704 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.180845 kubelet[2728]: E0117 13:44:06.180742 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.181234 kubelet[2728]: E0117 13:44:06.180986 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.181234 kubelet[2728]: W0117 13:44:06.180999 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.181234 kubelet[2728]: E0117 13:44:06.181019 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.182420 kubelet[2728]: E0117 13:44:06.181890 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.182420 kubelet[2728]: W0117 13:44:06.182347 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.182420 kubelet[2728]: E0117 13:44:06.182377 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.284322 kubelet[2728]: E0117 13:44:06.284255 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.284816 kubelet[2728]: W0117 13:44:06.284348 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.284816 kubelet[2728]: E0117 13:44:06.284380 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:06.284816 kubelet[2728]: E0117 13:44:06.284802 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.285005 kubelet[2728]: W0117 13:44:06.284822 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.285005 kubelet[2728]: E0117 13:44:06.284837 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.285750 kubelet[2728]: E0117 13:44:06.285567 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.285750 kubelet[2728]: W0117 13:44:06.285589 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.285750 kubelet[2728]: E0117 13:44:06.285608 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.287531 kubelet[2728]: E0117 13:44:06.287261 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.287531 kubelet[2728]: W0117 13:44:06.287293 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.287531 kubelet[2728]: E0117 13:44:06.287331 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.288185 kubelet[2728]: E0117 13:44:06.287965 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.288185 kubelet[2728]: W0117 13:44:06.287986 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.288185 kubelet[2728]: E0117 13:44:06.288011 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.288846 kubelet[2728]: E0117 13:44:06.288578 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.288846 kubelet[2728]: W0117 13:44:06.288598 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.288846 kubelet[2728]: E0117 13:44:06.288614 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:06.289365 kubelet[2728]: E0117 13:44:06.289081 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.289365 kubelet[2728]: W0117 13:44:06.289099 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.289365 kubelet[2728]: E0117 13:44:06.289145 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.290322 kubelet[2728]: E0117 13:44:06.289916 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.290322 kubelet[2728]: W0117 13:44:06.289946 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.290322 kubelet[2728]: E0117 13:44:06.289963 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.291838 kubelet[2728]: E0117 13:44:06.291807 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.291838 kubelet[2728]: W0117 13:44:06.291830 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.291838 kubelet[2728]: E0117 13:44:06.291855 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.388387 kubelet[2728]: E0117 13:44:06.388358 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.388866 kubelet[2728]: W0117 13:44:06.388640 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.388866 kubelet[2728]: E0117 13:44:06.388678 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.389163 kubelet[2728]: E0117 13:44:06.389125 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.389379 kubelet[2728]: W0117 13:44:06.389278 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.389379 kubelet[2728]: E0117 13:44:06.389326 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:06.390006 kubelet[2728]: E0117 13:44:06.389915 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.390006 kubelet[2728]: W0117 13:44:06.389934 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.390006 kubelet[2728]: E0117 13:44:06.389963 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.390385 kubelet[2728]: E0117 13:44:06.390348 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.390493 kubelet[2728]: W0117 13:44:06.390385 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.390493 kubelet[2728]: E0117 13:44:06.390425 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.391202 kubelet[2728]: E0117 13:44:06.390877 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.391202 kubelet[2728]: W0117 13:44:06.390898 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.391202 kubelet[2728]: E0117 13:44:06.390924 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.391356 kubelet[2728]: E0117 13:44:06.391240 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.391356 kubelet[2728]: W0117 13:44:06.391256 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.391356 kubelet[2728]: E0117 13:44:06.391272 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.391561 kubelet[2728]: E0117 13:44:06.391521 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.391561 kubelet[2728]: W0117 13:44:06.391543 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.391561 kubelet[2728]: E0117 13:44:06.391558 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:06.392169 kubelet[2728]: E0117 13:44:06.391845 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.392169 kubelet[2728]: W0117 13:44:06.391860 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.392169 kubelet[2728]: E0117 13:44:06.392109 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.392169 kubelet[2728]: W0117 13:44:06.392122 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.393032 kubelet[2728]: E0117 13:44:06.392385 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.393032 kubelet[2728]: W0117 13:44:06.392410 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.393032 kubelet[2728]: E0117 13:44:06.392466 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.393032 kubelet[2728]: E0117 13:44:06.392693 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.393032 kubelet[2728]: W0117 13:44:06.392709 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.393032 kubelet[2728]: E0117 13:44:06.392731 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.394814 kubelet[2728]: E0117 13:44:06.393113 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.394814 kubelet[2728]: W0117 13:44:06.393149 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.394814 kubelet[2728]: E0117 13:44:06.393166 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.394814 kubelet[2728]: E0117 13:44:06.394062 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:06.402790 kubelet[2728]: E0117 13:44:06.402759 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.402790 kubelet[2728]: W0117 13:44:06.402783 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.402956 kubelet[2728]: E0117 13:44:06.402813 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.402956 kubelet[2728]: E0117 13:44:06.402849 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.406274 kubelet[2728]: E0117 13:44:06.404249 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.406274 kubelet[2728]: W0117 13:44:06.404275 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.406274 kubelet[2728]: E0117 13:44:06.404294 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.406274 kubelet[2728]: E0117 13:44:06.405365 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.406274 kubelet[2728]: W0117 13:44:06.405379 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.406274 kubelet[2728]: E0117 13:44:06.405396 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.409954 kubelet[2728]: E0117 13:44:06.408251 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.409954 kubelet[2728]: W0117 13:44:06.408274 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.409954 kubelet[2728]: E0117 13:44:06.408573 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.409954 kubelet[2728]: W0117 13:44:06.408587 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.409954 kubelet[2728]: E0117 13:44:06.408603 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:06.409954 kubelet[2728]: E0117 13:44:06.408633 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.410680 kubelet[2728]: E0117 13:44:06.410652 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:06.410680 kubelet[2728]: W0117 13:44:06.410675 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:06.410771 kubelet[2728]: E0117 13:44:06.410692 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:06.437269 containerd[1493]: time="2025-01-17T13:44:06.436531189Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-549dc69985-qcz2l,Uid:0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e,Namespace:calico-system,Attempt:0,}" Jan 17 13:44:06.496280 containerd[1493]: time="2025-01-17T13:44:06.495846099Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:44:06.496280 containerd[1493]: time="2025-01-17T13:44:06.495996558Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:44:06.496280 containerd[1493]: time="2025-01-17T13:44:06.496021002Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:06.498043 containerd[1493]: time="2025-01-17T13:44:06.497930587Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:06.544423 systemd[1]: Started cri-containerd-21d4baf99076af6535d0fae9f6aba2978f1de0222f2032afcf380a80798a7e69.scope - libcontainer container 21d4baf99076af6535d0fae9f6aba2978f1de0222f2032afcf380a80798a7e69. Jan 17 13:44:06.571288 containerd[1493]: time="2025-01-17T13:44:06.570329181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-c7vcc,Uid:1b3859c4-a241-4cf9-8c29-3c1172341767,Namespace:calico-system,Attempt:0,}" Jan 17 13:44:06.691791 containerd[1493]: time="2025-01-17T13:44:06.691109233Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:44:06.691791 containerd[1493]: time="2025-01-17T13:44:06.691747555Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:44:06.695593 containerd[1493]: time="2025-01-17T13:44:06.694522757Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:06.696379 containerd[1493]: time="2025-01-17T13:44:06.696016651Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:06.700522 containerd[1493]: time="2025-01-17T13:44:06.700449027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-549dc69985-qcz2l,Uid:0f6a0aee-24fc-4d99-8ffc-cd83e4c6c39e,Namespace:calico-system,Attempt:0,} returns sandbox id \"21d4baf99076af6535d0fae9f6aba2978f1de0222f2032afcf380a80798a7e69\"" Jan 17 13:44:06.707505 containerd[1493]: time="2025-01-17T13:44:06.707160206Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Jan 17 13:44:06.742494 systemd[1]: Started cri-containerd-924e75464a261d9646f988b7e46bc53c3425cf2ce7f7e4277a73f469ac12992e.scope - libcontainer container 924e75464a261d9646f988b7e46bc53c3425cf2ce7f7e4277a73f469ac12992e. Jan 17 13:44:06.784099 containerd[1493]: time="2025-01-17T13:44:06.783931413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-c7vcc,Uid:1b3859c4-a241-4cf9-8c29-3c1172341767,Namespace:calico-system,Attempt:0,} returns sandbox id \"924e75464a261d9646f988b7e46bc53c3425cf2ce7f7e4277a73f469ac12992e\"" Jan 17 13:44:07.282711 kubelet[2728]: E0117 13:44:07.282592 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9fwmg" podUID="41e59e9c-f5c4-48af-a614-7a43cf86d00d" Jan 17 13:44:08.392140 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2169214071.mount: Deactivated successfully. Jan 17 13:44:09.283331 kubelet[2728]: E0117 13:44:09.283207 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9fwmg" podUID="41e59e9c-f5c4-48af-a614-7a43cf86d00d" Jan 17 13:44:09.781593 containerd[1493]: time="2025-01-17T13:44:09.781456250Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:09.798518 containerd[1493]: time="2025-01-17T13:44:09.798411762Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363" Jan 17 13:44:09.799770 containerd[1493]: time="2025-01-17T13:44:09.799686788Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:09.804321 containerd[1493]: time="2025-01-17T13:44:09.804243879Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:09.805917 containerd[1493]: time="2025-01-17T13:44:09.805241748Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 3.097938234s" Jan 17 13:44:09.805917 containerd[1493]: time="2025-01-17T13:44:09.805288673Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference 
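The PullImage/ImageCreate sequence above is an ordinary containerd image pull driven through the CRI plugin, with the intermediate "loading plugin" lines coming from the per-pod runc v2 shim starting up. A hypothetical standalone pull of the same image via containerd's Go client; the socket path and the k8s.io namespace are stock containerd defaults, assumed rather than read from this log:

// Hypothetical standalone pull of the image containerd fetched above.
// The CRI plugin drives the same image service internally.
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Kubernetes-managed images live in containerd's "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/typha:v3.29.1",
		containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}
	size, _ := img.Size(ctx)
	fmt.Printf("pulled %s (%d bytes)\n", img.Name(), size)
}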
\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Jan 17 13:44:09.808904 containerd[1493]: time="2025-01-17T13:44:09.807597973Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Jan 17 13:44:09.862227 containerd[1493]: time="2025-01-17T13:44:09.861770746Z" level=info msg="CreateContainer within sandbox \"21d4baf99076af6535d0fae9f6aba2978f1de0222f2032afcf380a80798a7e69\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 17 13:44:09.929646 containerd[1493]: time="2025-01-17T13:44:09.929451596Z" level=info msg="CreateContainer within sandbox \"21d4baf99076af6535d0fae9f6aba2978f1de0222f2032afcf380a80798a7e69\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"2914b9d4584a28f6c890711032b436efe1fbd7208170f0561ad5554d9b39d2b6\"" Jan 17 13:44:09.932311 containerd[1493]: time="2025-01-17T13:44:09.932118876Z" level=info msg="StartContainer for \"2914b9d4584a28f6c890711032b436efe1fbd7208170f0561ad5554d9b39d2b6\"" Jan 17 13:44:10.011444 systemd[1]: Started cri-containerd-2914b9d4584a28f6c890711032b436efe1fbd7208170f0561ad5554d9b39d2b6.scope - libcontainer container 2914b9d4584a28f6c890711032b436efe1fbd7208170f0561ad5554d9b39d2b6. Jan 17 13:44:10.098227 containerd[1493]: time="2025-01-17T13:44:10.097435666Z" level=info msg="StartContainer for \"2914b9d4584a28f6c890711032b436efe1fbd7208170f0561ad5554d9b39d2b6\" returns successfully" Jan 17 13:44:10.401499 kubelet[2728]: I0117 13:44:10.400628 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-549dc69985-qcz2l" podStartSLOduration=3.297454779 podStartE2EDuration="6.400594659s" podCreationTimestamp="2025-01-17 13:44:04 +0000 UTC" firstStartedPulling="2025-01-17 13:44:06.704099734 +0000 UTC m=+24.691678291" lastFinishedPulling="2025-01-17 13:44:09.807239603 +0000 UTC m=+27.794818171" observedRunningTime="2025-01-17 13:44:10.399974279 +0000 UTC m=+28.387552854" watchObservedRunningTime="2025-01-17 13:44:10.400594659 +0000 UTC m=+28.388173239" Jan 17 13:44:10.451690 kubelet[2728]: E0117 13:44:10.451427 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.451690 kubelet[2728]: W0117 13:44:10.451465 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.451690 kubelet[2728]: E0117 13:44:10.451510 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.452255 kubelet[2728]: E0117 13:44:10.452065 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.452255 kubelet[2728]: W0117 13:44:10.452103 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.452255 kubelet[2728]: E0117 13:44:10.452122 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:10.453025 kubelet[2728]: E0117 13:44:10.452799 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.453025 kubelet[2728]: W0117 13:44:10.452819 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.453025 kubelet[2728]: E0117 13:44:10.452835 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.453659 kubelet[2728]: E0117 13:44:10.453441 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.453659 kubelet[2728]: W0117 13:44:10.453466 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.453659 kubelet[2728]: E0117 13:44:10.453482 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.454092 kubelet[2728]: E0117 13:44:10.454062 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.454382 kubelet[2728]: W0117 13:44:10.454228 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.454382 kubelet[2728]: E0117 13:44:10.454254 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.454936 kubelet[2728]: E0117 13:44:10.454736 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.454936 kubelet[2728]: W0117 13:44:10.454785 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.454936 kubelet[2728]: E0117 13:44:10.454803 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.455501 kubelet[2728]: E0117 13:44:10.455308 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.455501 kubelet[2728]: W0117 13:44:10.455347 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.455501 kubelet[2728]: E0117 13:44:10.455366 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:10.456140 kubelet[2728]: E0117 13:44:10.455910 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.456140 kubelet[2728]: W0117 13:44:10.455929 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.456140 kubelet[2728]: E0117 13:44:10.455966 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.456865 kubelet[2728]: E0117 13:44:10.456570 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.456865 kubelet[2728]: W0117 13:44:10.456589 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.456865 kubelet[2728]: E0117 13:44:10.456624 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.457338 kubelet[2728]: E0117 13:44:10.457137 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.457338 kubelet[2728]: W0117 13:44:10.457157 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.457338 kubelet[2728]: E0117 13:44:10.457173 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.458079 kubelet[2728]: E0117 13:44:10.457694 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.458079 kubelet[2728]: W0117 13:44:10.457709 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.458079 kubelet[2728]: E0117 13:44:10.457724 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.458550 kubelet[2728]: E0117 13:44:10.458393 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.458550 kubelet[2728]: W0117 13:44:10.458413 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.458550 kubelet[2728]: E0117 13:44:10.458429 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:10.458860 kubelet[2728]: E0117 13:44:10.458710 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.458860 kubelet[2728]: W0117 13:44:10.458725 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.459162 kubelet[2728]: E0117 13:44:10.459021 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.459508 kubelet[2728]: E0117 13:44:10.459342 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.459508 kubelet[2728]: W0117 13:44:10.459362 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.459508 kubelet[2728]: E0117 13:44:10.459378 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.459743 kubelet[2728]: E0117 13:44:10.459723 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.459856 kubelet[2728]: W0117 13:44:10.459835 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.459961 kubelet[2728]: E0117 13:44:10.459941 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.528444 kubelet[2728]: E0117 13:44:10.528327 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.528444 kubelet[2728]: W0117 13:44:10.528356 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.528444 kubelet[2728]: E0117 13:44:10.528381 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.528872 kubelet[2728]: E0117 13:44:10.528700 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.528872 kubelet[2728]: W0117 13:44:10.528722 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.528872 kubelet[2728]: E0117 13:44:10.528739 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:10.529261 kubelet[2728]: E0117 13:44:10.529033 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.529261 kubelet[2728]: W0117 13:44:10.529053 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.529261 kubelet[2728]: E0117 13:44:10.529083 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.529553 kubelet[2728]: E0117 13:44:10.529344 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.529553 kubelet[2728]: W0117 13:44:10.529360 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.529553 kubelet[2728]: E0117 13:44:10.529400 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.529895 kubelet[2728]: E0117 13:44:10.529788 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.529895 kubelet[2728]: W0117 13:44:10.529819 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.529895 kubelet[2728]: E0117 13:44:10.529849 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.530143 kubelet[2728]: E0117 13:44:10.530122 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.530143 kubelet[2728]: W0117 13:44:10.530143 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.530478 kubelet[2728]: E0117 13:44:10.530168 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.530478 kubelet[2728]: E0117 13:44:10.530464 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.530766 kubelet[2728]: W0117 13:44:10.530478 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.530766 kubelet[2728]: E0117 13:44:10.530504 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:10.531063 kubelet[2728]: E0117 13:44:10.530957 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.531063 kubelet[2728]: W0117 13:44:10.530989 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.531063 kubelet[2728]: E0117 13:44:10.531019 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.531333 kubelet[2728]: E0117 13:44:10.531309 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.531333 kubelet[2728]: W0117 13:44:10.531331 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.531446 kubelet[2728]: E0117 13:44:10.531367 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.531717 kubelet[2728]: E0117 13:44:10.531695 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.531717 kubelet[2728]: W0117 13:44:10.531716 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.531962 kubelet[2728]: E0117 13:44:10.531750 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.532145 kubelet[2728]: E0117 13:44:10.532125 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.532238 kubelet[2728]: W0117 13:44:10.532145 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.532954 kubelet[2728]: E0117 13:44:10.532293 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:10.532954 kubelet[2728]: E0117 13:44:10.532556 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.532954 kubelet[2728]: W0117 13:44:10.532571 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.532954 kubelet[2728]: E0117 13:44:10.532854 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.532954 kubelet[2728]: E0117 13:44:10.532858 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.533397 kubelet[2728]: W0117 13:44:10.532878 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.533397 kubelet[2728]: E0117 13:44:10.533253 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.533552 kubelet[2728]: E0117 13:44:10.533520 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.533552 kubelet[2728]: W0117 13:44:10.533545 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.533735 kubelet[2728]: E0117 13:44:10.533570 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.533869 kubelet[2728]: E0117 13:44:10.533856 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.534194 kubelet[2728]: W0117 13:44:10.533871 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.534194 kubelet[2728]: E0117 13:44:10.533895 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.534560 kubelet[2728]: E0117 13:44:10.534385 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.534560 kubelet[2728]: W0117 13:44:10.534406 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.534560 kubelet[2728]: E0117 13:44:10.534433 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:10.534816 kubelet[2728]: E0117 13:44:10.534795 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.535095 kubelet[2728]: W0117 13:44:10.534917 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.535095 kubelet[2728]: E0117 13:44:10.534954 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:10.535311 kubelet[2728]: E0117 13:44:10.535291 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:10.535436 kubelet[2728]: W0117 13:44:10.535414 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:10.535573 kubelet[2728]: E0117 13:44:10.535535 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.283892 kubelet[2728]: E0117 13:44:11.282684 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9fwmg" podUID="41e59e9c-f5c4-48af-a614-7a43cf86d00d" Jan 17 13:44:11.466229 kubelet[2728]: E0117 13:44:11.466156 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.466229 kubelet[2728]: W0117 13:44:11.466218 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.466827 kubelet[2728]: E0117 13:44:11.466248 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.466827 kubelet[2728]: E0117 13:44:11.466662 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.466827 kubelet[2728]: W0117 13:44:11.466677 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.466827 kubelet[2728]: E0117 13:44:11.466694 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:11.467096 kubelet[2728]: E0117 13:44:11.467060 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.467192 kubelet[2728]: W0117 13:44:11.467157 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.467267 kubelet[2728]: E0117 13:44:11.467210 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.467589 kubelet[2728]: E0117 13:44:11.467554 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.467589 kubelet[2728]: W0117 13:44:11.467581 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.467698 kubelet[2728]: E0117 13:44:11.467598 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.467998 kubelet[2728]: E0117 13:44:11.467965 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.467998 kubelet[2728]: W0117 13:44:11.467989 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.468109 kubelet[2728]: E0117 13:44:11.468006 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.468382 kubelet[2728]: E0117 13:44:11.468349 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.468461 kubelet[2728]: W0117 13:44:11.468391 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.468461 kubelet[2728]: E0117 13:44:11.468409 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.468766 kubelet[2728]: E0117 13:44:11.468736 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.468766 kubelet[2728]: W0117 13:44:11.468760 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.468886 kubelet[2728]: E0117 13:44:11.468777 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:11.469124 kubelet[2728]: E0117 13:44:11.469102 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.469340 kubelet[2728]: W0117 13:44:11.469125 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.469340 kubelet[2728]: E0117 13:44:11.469143 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.470575 kubelet[2728]: E0117 13:44:11.469567 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.470575 kubelet[2728]: W0117 13:44:11.469590 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.470575 kubelet[2728]: E0117 13:44:11.469625 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.470575 kubelet[2728]: E0117 13:44:11.469974 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.470575 kubelet[2728]: W0117 13:44:11.469989 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.470575 kubelet[2728]: E0117 13:44:11.470004 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.470575 kubelet[2728]: E0117 13:44:11.470303 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.470575 kubelet[2728]: W0117 13:44:11.470317 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.470575 kubelet[2728]: E0117 13:44:11.470346 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.471037 kubelet[2728]: E0117 13:44:11.470638 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.471037 kubelet[2728]: W0117 13:44:11.470654 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.471037 kubelet[2728]: E0117 13:44:11.470696 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:11.471037 kubelet[2728]: E0117 13:44:11.471011 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.471037 kubelet[2728]: W0117 13:44:11.471025 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.471770 kubelet[2728]: E0117 13:44:11.471040 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.471770 kubelet[2728]: E0117 13:44:11.471363 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.471770 kubelet[2728]: W0117 13:44:11.471377 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.471770 kubelet[2728]: E0117 13:44:11.471391 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.471770 kubelet[2728]: E0117 13:44:11.471687 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.471770 kubelet[2728]: W0117 13:44:11.471704 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.471770 kubelet[2728]: E0117 13:44:11.471731 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.538642 kubelet[2728]: E0117 13:44:11.538458 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.538642 kubelet[2728]: W0117 13:44:11.538499 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.538642 kubelet[2728]: E0117 13:44:11.538531 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.539953 kubelet[2728]: E0117 13:44:11.539895 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.539953 kubelet[2728]: W0117 13:44:11.539917 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.539953 kubelet[2728]: E0117 13:44:11.539950 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:11.540347 kubelet[2728]: E0117 13:44:11.540296 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.540347 kubelet[2728]: W0117 13:44:11.540319 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.540347 kubelet[2728]: E0117 13:44:11.540344 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.541045 kubelet[2728]: E0117 13:44:11.540629 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.541045 kubelet[2728]: W0117 13:44:11.540644 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.541045 kubelet[2728]: E0117 13:44:11.540771 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.541045 kubelet[2728]: E0117 13:44:11.540886 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.541045 kubelet[2728]: W0117 13:44:11.540900 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.541340 kubelet[2728]: E0117 13:44:11.541197 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.541340 kubelet[2728]: W0117 13:44:11.541225 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.541340 kubelet[2728]: E0117 13:44:11.541241 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.542782 kubelet[2728]: E0117 13:44:11.541551 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.542782 kubelet[2728]: W0117 13:44:11.541574 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.542782 kubelet[2728]: E0117 13:44:11.541592 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:11.542782 kubelet[2728]: E0117 13:44:11.541827 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.542782 kubelet[2728]: W0117 13:44:11.541843 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.542782 kubelet[2728]: E0117 13:44:11.541858 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.542782 kubelet[2728]: E0117 13:44:11.542161 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.542782 kubelet[2728]: W0117 13:44:11.542175 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.542782 kubelet[2728]: E0117 13:44:11.542210 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.542782 kubelet[2728]: E0117 13:44:11.542417 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.543327 kubelet[2728]: E0117 13:44:11.542784 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.543327 kubelet[2728]: W0117 13:44:11.542810 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.543327 kubelet[2728]: E0117 13:44:11.542836 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.543509 kubelet[2728]: E0117 13:44:11.543414 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.543509 kubelet[2728]: W0117 13:44:11.543429 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.543509 kubelet[2728]: E0117 13:44:11.543464 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 17 13:44:11.543829 kubelet[2728]: E0117 13:44:11.543809 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.543829 kubelet[2728]: W0117 13:44:11.543829 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.543967 kubelet[2728]: E0117 13:44:11.543953 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.544562 kubelet[2728]: E0117 13:44:11.544494 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.544562 kubelet[2728]: W0117 13:44:11.544519 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.544779 kubelet[2728]: E0117 13:44:11.544721 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.544779 kubelet[2728]: E0117 13:44:11.544774 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.544961 kubelet[2728]: W0117 13:44:11.544788 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.544961 kubelet[2728]: E0117 13:44:11.544883 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.545160 kubelet[2728]: E0117 13:44:11.545129 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.545160 kubelet[2728]: W0117 13:44:11.545153 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.545345 kubelet[2728]: E0117 13:44:11.545192 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 17 13:44:11.545536 kubelet[2728]: E0117 13:44:11.545505 2728 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 17 13:44:11.545536 kubelet[2728]: W0117 13:44:11.545529 2728 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 17 13:44:11.545657 kubelet[2728]: E0117 13:44:11.545546 2728 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
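The triplet above is kubelet's FlexVolume probe failing in two stages: the exec of /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds fails with "executable file not found in $PATH" (the uds binary is installed by Calico's flexvol-driver container, which has not run yet), so the driver call returns empty output, and driver-call.go then cannot unmarshal "" as the JSON status it expects. As a point of reference, here is a minimal sketch — hypothetical code, not the real nodeagent~uds driver — of what a FlexVolume executable has to print for the init call so that unmarshaling succeeds:

    // flexvolume_init_sketch.go — minimal sketch of the FlexVolume "init"
    // contract kubelet's driver-call.go is exercising above: the driver is an
    // executable that prints a JSON status object on stdout. Empty stdout is
    // exactly what produces "unexpected end of JSON input".
    package main

    import (
    	"encoding/json"
    	"os"
    )

    func main() {
    	if len(os.Args) > 1 && os.Args[1] == "init" {
    		// "attach": false tells kubelet this driver has no attach/detach phase.
    		json.NewEncoder(os.Stdout).Encode(map[string]interface{}{
    			"status":       "Success",
    			"capabilities": map[string]bool{"attach": false},
    		})
    		return
    	}
    	// Calls the driver does not implement report "Not supported".
    	json.NewEncoder(os.Stdout).Encode(map[string]string{"status": "Not supported"})
    }

Once the flexvol-driver container pulled below has copied the real binary into place, these probe errors stop.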
Jan 17 13:44:11.596407 containerd[1493]: time="2025-01-17T13:44:11.596132264Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:11.597803 containerd[1493]: time="2025-01-17T13:44:11.597499250Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121" Jan 17 13:44:11.598729 containerd[1493]: time="2025-01-17T13:44:11.598642890Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:11.603020 containerd[1493]: time="2025-01-17T13:44:11.602932652Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:11.604693 containerd[1493]: time="2025-01-17T13:44:11.604422232Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.795554703s" Jan 17 13:44:11.604693 containerd[1493]: time="2025-01-17T13:44:11.604471460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Jan 17 13:44:11.610478 containerd[1493]: time="2025-01-17T13:44:11.610418554Z" level=info msg="CreateContainer within sandbox \"924e75464a261d9646f988b7e46bc53c3425cf2ce7f7e4277a73f469ac12992e\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 17 13:44:11.630211 containerd[1493]: time="2025-01-17T13:44:11.630108947Z" level=info msg="CreateContainer within sandbox 
\"924e75464a261d9646f988b7e46bc53c3425cf2ce7f7e4277a73f469ac12992e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d036582665d0b654ccbf6ba2e8b7d6a342835c68b164c1e644021616b71d033f\"" Jan 17 13:44:11.631592 containerd[1493]: time="2025-01-17T13:44:11.630891831Z" level=info msg="StartContainer for \"d036582665d0b654ccbf6ba2e8b7d6a342835c68b164c1e644021616b71d033f\"" Jan 17 13:44:11.681142 systemd[1]: run-containerd-runc-k8s.io-d036582665d0b654ccbf6ba2e8b7d6a342835c68b164c1e644021616b71d033f-runc.tYwT2q.mount: Deactivated successfully. Jan 17 13:44:11.696434 systemd[1]: Started cri-containerd-d036582665d0b654ccbf6ba2e8b7d6a342835c68b164c1e644021616b71d033f.scope - libcontainer container d036582665d0b654ccbf6ba2e8b7d6a342835c68b164c1e644021616b71d033f. Jan 17 13:44:11.751490 containerd[1493]: time="2025-01-17T13:44:11.751380586Z" level=info msg="StartContainer for \"d036582665d0b654ccbf6ba2e8b7d6a342835c68b164c1e644021616b71d033f\" returns successfully" Jan 17 13:44:11.771886 systemd[1]: cri-containerd-d036582665d0b654ccbf6ba2e8b7d6a342835c68b164c1e644021616b71d033f.scope: Deactivated successfully. Jan 17 13:44:11.832355 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d036582665d0b654ccbf6ba2e8b7d6a342835c68b164c1e644021616b71d033f-rootfs.mount: Deactivated successfully. Jan 17 13:44:11.834133 containerd[1493]: time="2025-01-17T13:44:11.821461916Z" level=info msg="shim disconnected" id=d036582665d0b654ccbf6ba2e8b7d6a342835c68b164c1e644021616b71d033f namespace=k8s.io Jan 17 13:44:11.834133 containerd[1493]: time="2025-01-17T13:44:11.832981604Z" level=warning msg="cleaning up after shim disconnected" id=d036582665d0b654ccbf6ba2e8b7d6a342835c68b164c1e644021616b71d033f namespace=k8s.io Jan 17 13:44:11.834133 containerd[1493]: time="2025-01-17T13:44:11.833017699Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 17 13:44:12.397217 containerd[1493]: time="2025-01-17T13:44:12.396439034Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Jan 17 13:44:13.282307 kubelet[2728]: E0117 13:44:13.282201 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9fwmg" podUID="41e59e9c-f5c4-48af-a614-7a43cf86d00d" Jan 17 13:44:15.282546 kubelet[2728]: E0117 13:44:15.282262 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9fwmg" podUID="41e59e9c-f5c4-48af-a614-7a43cf86d00d" Jan 17 13:44:17.282025 kubelet[2728]: E0117 13:44:17.281940 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9fwmg" podUID="41e59e9c-f5c4-48af-a614-7a43cf86d00d" Jan 17 13:44:18.625123 containerd[1493]: time="2025-01-17T13:44:18.624897566Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:18.626543 containerd[1493]: time="2025-01-17T13:44:18.626450893Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active 
requests=0, bytes read=96154154" Jan 17 13:44:18.627607 containerd[1493]: time="2025-01-17T13:44:18.627494680Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:18.630718 containerd[1493]: time="2025-01-17T13:44:18.630648077Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:18.631970 containerd[1493]: time="2025-01-17T13:44:18.631782152Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 6.235278219s" Jan 17 13:44:18.631970 containerd[1493]: time="2025-01-17T13:44:18.631835781Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Jan 17 13:44:18.635927 containerd[1493]: time="2025-01-17T13:44:18.635864741Z" level=info msg="CreateContainer within sandbox \"924e75464a261d9646f988b7e46bc53c3425cf2ce7f7e4277a73f469ac12992e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 17 13:44:18.660909 containerd[1493]: time="2025-01-17T13:44:18.660836215Z" level=info msg="CreateContainer within sandbox \"924e75464a261d9646f988b7e46bc53c3425cf2ce7f7e4277a73f469ac12992e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b5085119f01053366b1439609c8b6d5a8720408a9b98edbfa4b9e5d03f7dc5af\"" Jan 17 13:44:18.661767 containerd[1493]: time="2025-01-17T13:44:18.661670272Z" level=info msg="StartContainer for \"b5085119f01053366b1439609c8b6d5a8720408a9b98edbfa4b9e5d03f7dc5af\"" Jan 17 13:44:18.746437 systemd[1]: Started cri-containerd-b5085119f01053366b1439609c8b6d5a8720408a9b98edbfa4b9e5d03f7dc5af.scope - libcontainer container b5085119f01053366b1439609c8b6d5a8720408a9b98edbfa4b9e5d03f7dc5af. Jan 17 13:44:18.799827 containerd[1493]: time="2025-01-17T13:44:18.799099206Z" level=info msg="StartContainer for \"b5085119f01053366b1439609c8b6d5a8720408a9b98edbfa4b9e5d03f7dc5af\" returns successfully" Jan 17 13:44:19.282691 kubelet[2728]: E0117 13:44:19.282553 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9fwmg" podUID="41e59e9c-f5c4-48af-a614-7a43cf86d00d" Jan 17 13:44:19.850411 systemd[1]: cri-containerd-b5085119f01053366b1439609c8b6d5a8720408a9b98edbfa4b9e5d03f7dc5af.scope: Deactivated successfully. Jan 17 13:44:19.906850 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b5085119f01053366b1439609c8b6d5a8720408a9b98edbfa4b9e5d03f7dc5af-rootfs.mount: Deactivated successfully. 
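Both image pulls above (pod2daemon-flexvol in ~1.8s, cni in ~6.2s) go through the CRI plugin inside containerd, which is why each pull is bracketed by ImageCreate events for the tag, the config-blob digest, and the repo digest in the k8s.io namespace. A rough direct equivalent using containerd's Go client — a sketch under those assumptions, not kubelet's actual call path — would be:

    // pull_sketch.go — rough equivalent of the PullImage round-trips logged
    // above, issued directly against containerd in the same k8s.io namespace
    // the CRI plugin records its ImageCreate events in.
    package main

    import (
    	"context"
    	"fmt"
    	"log"

    	"github.com/containerd/containerd"
    	"github.com/containerd/containerd/namespaces"
    )

    func main() {
    	client, err := containerd.New("/run/containerd/containerd.sock")
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer client.Close()

    	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
    	img, err := client.Pull(ctx, "ghcr.io/flatcar/calico/cni:v3.29.1",
    		containerd.WithPullUnpack) // unpack the snapshot, as CRI does
    	if err != nil {
    		log.Fatal(err)
    	}
    	fmt.Println("pulled:", img.Name())
    }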
Jan 17 13:44:19.908478 containerd[1493]: time="2025-01-17T13:44:19.908235134Z" level=info msg="shim disconnected" id=b5085119f01053366b1439609c8b6d5a8720408a9b98edbfa4b9e5d03f7dc5af namespace=k8s.io Jan 17 13:44:19.908478 containerd[1493]: time="2025-01-17T13:44:19.908432710Z" level=warning msg="cleaning up after shim disconnected" id=b5085119f01053366b1439609c8b6d5a8720408a9b98edbfa4b9e5d03f7dc5af namespace=k8s.io Jan 17 13:44:19.909085 containerd[1493]: time="2025-01-17T13:44:19.908488736Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 17 13:44:19.929509 containerd[1493]: time="2025-01-17T13:44:19.929393334Z" level=warning msg="cleanup warnings time=\"2025-01-17T13:44:19Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Jan 17 13:44:19.938570 kubelet[2728]: I0117 13:44:19.937436 2728 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Jan 17 13:44:19.979027 kubelet[2728]: I0117 13:44:19.977938 2728 topology_manager.go:215] "Topology Admit Handler" podUID="17460e26-fb78-4395-87e8-c326be51e429" podNamespace="kube-system" podName="coredns-7db6d8ff4d-267m7" Jan 17 13:44:19.988539 kubelet[2728]: W0117 13:44:19.987508 2728 reflector.go:547] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:srv-so9hk.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'srv-so9hk.gb1.brightbox.com' and this object Jan 17 13:44:19.988539 kubelet[2728]: E0117 13:44:19.987589 2728 reflector.go:150] object-"kube-system"/"coredns": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:srv-so9hk.gb1.brightbox.com" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'srv-so9hk.gb1.brightbox.com' and this object Jan 17 13:44:19.997324 systemd[1]: Created slice kubepods-burstable-pod17460e26_fb78_4395_87e8_c326be51e429.slice - libcontainer container kubepods-burstable-pod17460e26_fb78_4395_87e8_c326be51e429.slice. Jan 17 13:44:20.000846 kubelet[2728]: I0117 13:44:20.000399 2728 topology_manager.go:215] "Topology Admit Handler" podUID="a205ba3a-791a-42f3-a1d4-a853e21d6652" podNamespace="calico-apiserver" podName="calico-apiserver-55b8dc98c7-f9vfx" Jan 17 13:44:20.000846 kubelet[2728]: I0117 13:44:20.000839 2728 topology_manager.go:215] "Topology Admit Handler" podUID="aede887a-58e3-4229-a42e-9f14a0e48c86" podNamespace="kube-system" podName="coredns-7db6d8ff4d-nlgqc" Jan 17 13:44:20.001278 kubelet[2728]: I0117 13:44:20.001222 2728 topology_manager.go:215] "Topology Admit Handler" podUID="a817919e-4a2a-40c6-a08c-acbbfe256250" podNamespace="calico-system" podName="calico-kube-controllers-848c57f8ff-m75gk" Jan 17 13:44:20.004653 kubelet[2728]: I0117 13:44:20.003565 2728 topology_manager.go:215] "Topology Admit Handler" podUID="3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3" podNamespace="calico-apiserver" podName="calico-apiserver-55b8dc98c7-xbj84" Jan 17 13:44:20.022860 systemd[1]: Created slice kubepods-besteffort-poda817919e_4a2a_40c6_a08c_acbbfe256250.slice - libcontainer container kubepods-besteffort-poda817919e_4a2a_40c6_a08c_acbbfe256250.slice. 
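The "Created slice" lines show the systemd cgroup driver's naming scheme for the burstable and besteffort pods admitted here: QoS class plus the pod UID with dashes mapped to underscores, so podUID 17460e26-fb78-4395-87e8-c326be51e429 becomes kubepods-burstable-pod17460e26_fb78_4395_87e8_c326be51e429.slice. A tiny helper reproducing the convention (hypothetical helper, not kubelet source):

    // slice_name.go — reproduces the slice naming visible in the
    // "Created slice" lines above for burstable/besteffort pods.
    package main

    import (
    	"fmt"
    	"strings"
    )

    func podSlice(qosClass, podUID string) string {
    	// systemd forbids "-" inside unit-name components, so the UID's
    	// dashes are escaped to underscores.
    	return fmt.Sprintf("kubepods-%s-pod%s.slice",
    		qosClass, strings.ReplaceAll(podUID, "-", "_"))
    }

    func main() {
    	// Prints kubepods-burstable-pod17460e26_fb78_4395_87e8_c326be51e429.slice
    	fmt.Println(podSlice("burstable", "17460e26-fb78-4395-87e8-c326be51e429"))
    }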
Jan 17 13:44:20.042208 systemd[1]: Created slice kubepods-besteffort-poda205ba3a_791a_42f3_a1d4_a853e21d6652.slice - libcontainer container kubepods-besteffort-poda205ba3a_791a_42f3_a1d4_a853e21d6652.slice. Jan 17 13:44:20.061583 systemd[1]: Created slice kubepods-burstable-podaede887a_58e3_4229_a42e_9f14a0e48c86.slice - libcontainer container kubepods-burstable-podaede887a_58e3_4229_a42e_9f14a0e48c86.slice. Jan 17 13:44:20.079950 systemd[1]: Created slice kubepods-besteffort-pod3dbbcf76_ddc3_459a_8f0a_cf5ce58b5fe3.slice - libcontainer container kubepods-besteffort-pod3dbbcf76_ddc3_459a_8f0a_cf5ce58b5fe3.slice. Jan 17 13:44:20.110015 kubelet[2728]: I0117 13:44:20.108894 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kvxs\" (UniqueName: \"kubernetes.io/projected/aede887a-58e3-4229-a42e-9f14a0e48c86-kube-api-access-7kvxs\") pod \"coredns-7db6d8ff4d-nlgqc\" (UID: \"aede887a-58e3-4229-a42e-9f14a0e48c86\") " pod="kube-system/coredns-7db6d8ff4d-nlgqc" Jan 17 13:44:20.110503 kubelet[2728]: I0117 13:44:20.110304 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a205ba3a-791a-42f3-a1d4-a853e21d6652-calico-apiserver-certs\") pod \"calico-apiserver-55b8dc98c7-f9vfx\" (UID: \"a205ba3a-791a-42f3-a1d4-a853e21d6652\") " pod="calico-apiserver/calico-apiserver-55b8dc98c7-f9vfx" Jan 17 13:44:20.110503 kubelet[2728]: I0117 13:44:20.110393 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb75b\" (UniqueName: \"kubernetes.io/projected/a205ba3a-791a-42f3-a1d4-a853e21d6652-kube-api-access-xb75b\") pod \"calico-apiserver-55b8dc98c7-f9vfx\" (UID: \"a205ba3a-791a-42f3-a1d4-a853e21d6652\") " pod="calico-apiserver/calico-apiserver-55b8dc98c7-f9vfx" Jan 17 13:44:20.110503 kubelet[2728]: I0117 13:44:20.110470 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17460e26-fb78-4395-87e8-c326be51e429-config-volume\") pod \"coredns-7db6d8ff4d-267m7\" (UID: \"17460e26-fb78-4395-87e8-c326be51e429\") " pod="kube-system/coredns-7db6d8ff4d-267m7" Jan 17 13:44:20.111325 kubelet[2728]: I0117 13:44:20.110759 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkvx2\" (UniqueName: \"kubernetes.io/projected/3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3-kube-api-access-dkvx2\") pod \"calico-apiserver-55b8dc98c7-xbj84\" (UID: \"3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3\") " pod="calico-apiserver/calico-apiserver-55b8dc98c7-xbj84" Jan 17 13:44:20.111325 kubelet[2728]: I0117 13:44:20.110820 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bwzz\" (UniqueName: \"kubernetes.io/projected/17460e26-fb78-4395-87e8-c326be51e429-kube-api-access-4bwzz\") pod \"coredns-7db6d8ff4d-267m7\" (UID: \"17460e26-fb78-4395-87e8-c326be51e429\") " pod="kube-system/coredns-7db6d8ff4d-267m7" Jan 17 13:44:20.111325 kubelet[2728]: I0117 13:44:20.110883 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a817919e-4a2a-40c6-a08c-acbbfe256250-tigera-ca-bundle\") pod \"calico-kube-controllers-848c57f8ff-m75gk\" (UID: \"a817919e-4a2a-40c6-a08c-acbbfe256250\") " 
pod="calico-system/calico-kube-controllers-848c57f8ff-m75gk" Jan 17 13:44:20.111325 kubelet[2728]: I0117 13:44:20.110918 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3-calico-apiserver-certs\") pod \"calico-apiserver-55b8dc98c7-xbj84\" (UID: \"3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3\") " pod="calico-apiserver/calico-apiserver-55b8dc98c7-xbj84" Jan 17 13:44:20.111325 kubelet[2728]: I0117 13:44:20.110988 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aede887a-58e3-4229-a42e-9f14a0e48c86-config-volume\") pod \"coredns-7db6d8ff4d-nlgqc\" (UID: \"aede887a-58e3-4229-a42e-9f14a0e48c86\") " pod="kube-system/coredns-7db6d8ff4d-nlgqc" Jan 17 13:44:20.111608 kubelet[2728]: I0117 13:44:20.111063 2728 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzjmg\" (UniqueName: \"kubernetes.io/projected/a817919e-4a2a-40c6-a08c-acbbfe256250-kube-api-access-xzjmg\") pod \"calico-kube-controllers-848c57f8ff-m75gk\" (UID: \"a817919e-4a2a-40c6-a08c-acbbfe256250\") " pod="calico-system/calico-kube-controllers-848c57f8ff-m75gk" Jan 17 13:44:20.336037 containerd[1493]: time="2025-01-17T13:44:20.335667620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-848c57f8ff-m75gk,Uid:a817919e-4a2a-40c6-a08c-acbbfe256250,Namespace:calico-system,Attempt:0,}" Jan 17 13:44:20.354807 containerd[1493]: time="2025-01-17T13:44:20.354402759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b8dc98c7-f9vfx,Uid:a205ba3a-791a-42f3-a1d4-a853e21d6652,Namespace:calico-apiserver,Attempt:0,}" Jan 17 13:44:20.388837 containerd[1493]: time="2025-01-17T13:44:20.388343219Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b8dc98c7-xbj84,Uid:3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3,Namespace:calico-apiserver,Attempt:0,}" Jan 17 13:44:20.477123 containerd[1493]: time="2025-01-17T13:44:20.476530864Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 17 13:44:20.689992 containerd[1493]: time="2025-01-17T13:44:20.689830079Z" level=error msg="Failed to destroy network for sandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:20.698473 containerd[1493]: time="2025-01-17T13:44:20.698370266Z" level=error msg="encountered an error cleaning up failed sandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:20.698829 containerd[1493]: time="2025-01-17T13:44:20.698678492Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-848c57f8ff-m75gk,Uid:a817919e-4a2a-40c6-a08c-acbbfe256250,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:20.699395 containerd[1493]: time="2025-01-17T13:44:20.699339728Z" level=error msg="Failed to destroy network for sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:20.700708 containerd[1493]: time="2025-01-17T13:44:20.700670035Z" level=error msg="encountered an error cleaning up failed sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:20.703249 containerd[1493]: time="2025-01-17T13:44:20.700882130Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b8dc98c7-f9vfx,Uid:a205ba3a-791a-42f3-a1d4-a853e21d6652,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:20.705207 containerd[1493]: time="2025-01-17T13:44:20.704778895Z" level=error msg="Failed to destroy network for sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:20.705344 containerd[1493]: time="2025-01-17T13:44:20.705191005Z" level=error msg="encountered an error cleaning up failed sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:20.705521 containerd[1493]: time="2025-01-17T13:44:20.705448226Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b8dc98c7-xbj84,Uid:3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:20.706087 kubelet[2728]: E0117 13:44:20.705998 2728 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:20.707249 kubelet[2728]: E0117 13:44:20.706021 2728 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:20.707249 kubelet[2728]: E0117 13:44:20.706659 2728 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-55b8dc98c7-xbj84" Jan 17 13:44:20.707249 kubelet[2728]: E0117 13:44:20.706719 2728 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-55b8dc98c7-xbj84" Jan 17 13:44:20.707249 kubelet[2728]: E0117 13:44:20.706659 2728 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-848c57f8ff-m75gk" Jan 17 13:44:20.707684 kubelet[2728]: E0117 13:44:20.706837 2728 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-848c57f8ff-m75gk" Jan 17 13:44:20.707684 kubelet[2728]: E0117 13:44:20.706892 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-848c57f8ff-m75gk_calico-system(a817919e-4a2a-40c6-a08c-acbbfe256250)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-848c57f8ff-m75gk_calico-system(a817919e-4a2a-40c6-a08c-acbbfe256250)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-848c57f8ff-m75gk" podUID="a817919e-4a2a-40c6-a08c-acbbfe256250" Jan 17 13:44:20.707684 kubelet[2728]: E0117 13:44:20.706071 2728 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Jan 17 13:44:20.708137 kubelet[2728]: E0117 13:44:20.706972 2728 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-55b8dc98c7-f9vfx" Jan 17 13:44:20.708137 kubelet[2728]: E0117 13:44:20.706998 2728 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-55b8dc98c7-f9vfx" Jan 17 13:44:20.708137 kubelet[2728]: E0117 13:44:20.707045 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-55b8dc98c7-f9vfx_calico-apiserver(a205ba3a-791a-42f3-a1d4-a853e21d6652)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-55b8dc98c7-f9vfx_calico-apiserver(a205ba3a-791a-42f3-a1d4-a853e21d6652)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-55b8dc98c7-f9vfx" podUID="a205ba3a-791a-42f3-a1d4-a853e21d6652" Jan 17 13:44:20.708620 kubelet[2728]: E0117 13:44:20.706790 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-55b8dc98c7-xbj84_calico-apiserver(3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-55b8dc98c7-xbj84_calico-apiserver(3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-55b8dc98c7-xbj84" podUID="3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3" Jan 17 13:44:21.230440 kubelet[2728]: E0117 13:44:21.230294 2728 configmap.go:199] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Jan 17 13:44:21.230660 kubelet[2728]: E0117 13:44:21.230607 2728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/17460e26-fb78-4395-87e8-c326be51e429-config-volume podName:17460e26-fb78-4395-87e8-c326be51e429 nodeName:}" failed. No retries permitted until 2025-01-17 13:44:21.730535976 +0000 UTC m=+39.718114532 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/17460e26-fb78-4395-87e8-c326be51e429-config-volume") pod "coredns-7db6d8ff4d-267m7" (UID: "17460e26-fb78-4395-87e8-c326be51e429") : failed to sync configmap cache: timed out waiting for the condition Jan 17 13:44:21.232620 kubelet[2728]: E0117 13:44:21.232444 2728 configmap.go:199] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Jan 17 13:44:21.232620 kubelet[2728]: E0117 13:44:21.232572 2728 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/aede887a-58e3-4229-a42e-9f14a0e48c86-config-volume podName:aede887a-58e3-4229-a42e-9f14a0e48c86 nodeName:}" failed. No retries permitted until 2025-01-17 13:44:21.732538436 +0000 UTC m=+39.720117004 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/aede887a-58e3-4229-a42e-9f14a0e48c86-config-volume") pod "coredns-7db6d8ff4d-nlgqc" (UID: "aede887a-58e3-4229-a42e-9f14a0e48c86") : failed to sync configmap cache: timed out waiting for the condition Jan 17 13:44:21.292355 systemd[1]: Created slice kubepods-besteffort-pod41e59e9c_f5c4_48af_a614_7a43cf86d00d.slice - libcontainer container kubepods-besteffort-pod41e59e9c_f5c4_48af_a614_7a43cf86d00d.slice. Jan 17 13:44:21.297444 containerd[1493]: time="2025-01-17T13:44:21.297381117Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9fwmg,Uid:41e59e9c-f5c4-48af-a614-7a43cf86d00d,Namespace:calico-system,Attempt:0,}" Jan 17 13:44:21.410118 containerd[1493]: time="2025-01-17T13:44:21.409810657Z" level=error msg="Failed to destroy network for sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:21.411435 containerd[1493]: time="2025-01-17T13:44:21.410958070Z" level=error msg="encountered an error cleaning up failed sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:21.411435 containerd[1493]: time="2025-01-17T13:44:21.411063470Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9fwmg,Uid:41e59e9c-f5c4-48af-a614-7a43cf86d00d,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:21.412549 kubelet[2728]: E0117 13:44:21.412464 2728 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:21.413087 kubelet[2728]: E0117 13:44:21.412812 2728 kuberuntime_sandbox.go:72] "Failed to create 
sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9fwmg" Jan 17 13:44:21.413087 kubelet[2728]: E0117 13:44:21.412889 2728 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9fwmg" Jan 17 13:44:21.413433 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74-shm.mount: Deactivated successfully. Jan 17 13:44:21.416705 kubelet[2728]: E0117 13:44:21.414132 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9fwmg_calico-system(41e59e9c-f5c4-48af-a614-7a43cf86d00d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9fwmg_calico-system(41e59e9c-f5c4-48af-a614-7a43cf86d00d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9fwmg" podUID="41e59e9c-f5c4-48af-a614-7a43cf86d00d" Jan 17 13:44:21.482620 kubelet[2728]: I0117 13:44:21.481614 2728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:21.489086 kubelet[2728]: I0117 13:44:21.488034 2728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:21.489998 containerd[1493]: time="2025-01-17T13:44:21.489363069Z" level=info msg="StopPodSandbox for \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\"" Jan 17 13:44:21.490347 containerd[1493]: time="2025-01-17T13:44:21.490269021Z" level=info msg="StopPodSandbox for \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\"" Jan 17 13:44:21.491775 containerd[1493]: time="2025-01-17T13:44:21.491725888Z" level=info msg="Ensure that sandbox ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20 in task-service has been cleanup successfully" Jan 17 13:44:21.492266 containerd[1493]: time="2025-01-17T13:44:21.492227533Z" level=info msg="Ensure that sandbox f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74 in task-service has been cleanup successfully" Jan 17 13:44:21.496930 kubelet[2728]: I0117 13:44:21.496832 2728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:21.499371 containerd[1493]: time="2025-01-17T13:44:21.499235523Z" level=info msg="StopPodSandbox for \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\"" Jan 17 13:44:21.500979 containerd[1493]: 
time="2025-01-17T13:44:21.500474645Z" level=info msg="Ensure that sandbox 8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31 in task-service has been cleanup successfully" Jan 17 13:44:21.501197 kubelet[2728]: I0117 13:44:21.501155 2728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:21.502789 containerd[1493]: time="2025-01-17T13:44:21.502747234Z" level=info msg="StopPodSandbox for \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\"" Jan 17 13:44:21.503132 containerd[1493]: time="2025-01-17T13:44:21.503102761Z" level=info msg="Ensure that sandbox fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf in task-service has been cleanup successfully" Jan 17 13:44:21.605811 containerd[1493]: time="2025-01-17T13:44:21.605713432Z" level=error msg="StopPodSandbox for \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\" failed" error="failed to destroy network for sandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:21.606130 containerd[1493]: time="2025-01-17T13:44:21.605714981Z" level=error msg="StopPodSandbox for \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\" failed" error="failed to destroy network for sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:21.606130 containerd[1493]: time="2025-01-17T13:44:21.605775729Z" level=error msg="StopPodSandbox for \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\" failed" error="failed to destroy network for sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:21.607231 kubelet[2728]: E0117 13:44:21.606665 2728 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:21.607231 kubelet[2728]: E0117 13:44:21.607085 2728 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74"} Jan 17 13:44:21.607401 kubelet[2728]: E0117 13:44:21.606664 2728 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
podSandboxID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:21.607401 kubelet[2728]: E0117 13:44:21.607284 2728 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31"} Jan 17 13:44:21.607401 kubelet[2728]: E0117 13:44:21.607360 2728 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"a817919e-4a2a-40c6-a08c-acbbfe256250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 17 13:44:21.607679 kubelet[2728]: E0117 13:44:21.607416 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"a817919e-4a2a-40c6-a08c-acbbfe256250\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-848c57f8ff-m75gk" podUID="a817919e-4a2a-40c6-a08c-acbbfe256250" Jan 17 13:44:21.607679 kubelet[2728]: E0117 13:44:21.606751 2728 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:21.607679 kubelet[2728]: E0117 13:44:21.607565 2728 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20"} Jan 17 13:44:21.607679 kubelet[2728]: E0117 13:44:21.607664 2728 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"a205ba3a-791a-42f3-a1d4-a853e21d6652\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 17 13:44:21.609121 kubelet[2728]: E0117 13:44:21.607699 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"a205ba3a-791a-42f3-a1d4-a853e21d6652\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-55b8dc98c7-f9vfx" podUID="a205ba3a-791a-42f3-a1d4-a853e21d6652" Jan 17 13:44:21.609121 kubelet[2728]: 
E0117 13:44:21.607912 2728 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"41e59e9c-f5c4-48af-a614-7a43cf86d00d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 17 13:44:21.609121 kubelet[2728]: E0117 13:44:21.607951 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"41e59e9c-f5c4-48af-a614-7a43cf86d00d\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9fwmg" podUID="41e59e9c-f5c4-48af-a614-7a43cf86d00d" Jan 17 13:44:21.614538 containerd[1493]: time="2025-01-17T13:44:21.614467013Z" level=error msg="StopPodSandbox for \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\" failed" error="failed to destroy network for sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:21.614883 kubelet[2728]: E0117 13:44:21.614819 2728 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:21.614977 kubelet[2728]: E0117 13:44:21.614899 2728 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf"} Jan 17 13:44:21.614977 kubelet[2728]: E0117 13:44:21.614952 2728 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 17 13:44:21.615136 kubelet[2728]: E0117 13:44:21.615010 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-55b8dc98c7-xbj84" podUID="3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3" Jan 17 13:44:21.871429 containerd[1493]: time="2025-01-17T13:44:21.871370764Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-nlgqc,Uid:aede887a-58e3-4229-a42e-9f14a0e48c86,Namespace:kube-system,Attempt:0,}" Jan 17 13:44:22.008679 containerd[1493]: time="2025-01-17T13:44:22.008569277Z" level=error msg="Failed to destroy network for sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:22.010354 containerd[1493]: time="2025-01-17T13:44:22.009596063Z" level=error msg="encountered an error cleaning up failed sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:22.010354 containerd[1493]: time="2025-01-17T13:44:22.009676024Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-nlgqc,Uid:aede887a-58e3-4229-a42e-9f14a0e48c86,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:22.014226 kubelet[2728]: E0117 13:44:22.011403 2728 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:22.014226 kubelet[2728]: E0117 13:44:22.011550 2728 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-nlgqc" Jan 17 13:44:22.014226 kubelet[2728]: E0117 13:44:22.011603 2728 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-nlgqc" Jan 17 13:44:22.013964 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4-shm.mount: Deactivated successfully. 
Jan 17 13:44:22.014951 kubelet[2728]: E0117 13:44:22.011732 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-nlgqc_kube-system(aede887a-58e3-4229-a42e-9f14a0e48c86)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-nlgqc_kube-system(aede887a-58e3-4229-a42e-9f14a0e48c86)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-nlgqc" podUID="aede887a-58e3-4229-a42e-9f14a0e48c86" Jan 17 13:44:22.109765 containerd[1493]: time="2025-01-17T13:44:22.109661903Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-267m7,Uid:17460e26-fb78-4395-87e8-c326be51e429,Namespace:kube-system,Attempt:0,}" Jan 17 13:44:22.257038 containerd[1493]: time="2025-01-17T13:44:22.256831150Z" level=error msg="Failed to destroy network for sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:22.259800 containerd[1493]: time="2025-01-17T13:44:22.259722188Z" level=error msg="encountered an error cleaning up failed sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:22.260378 containerd[1493]: time="2025-01-17T13:44:22.259978862Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-267m7,Uid:17460e26-fb78-4395-87e8-c326be51e429,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:22.261269 kubelet[2728]: E0117 13:44:22.260808 2728 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:22.261269 kubelet[2728]: E0117 13:44:22.261026 2728 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-267m7" Jan 17 13:44:22.261269 kubelet[2728]: E0117 13:44:22.261065 2728 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-267m7" Jan 17 13:44:22.261501 kubelet[2728]: E0117 13:44:22.261280 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-267m7_kube-system(17460e26-fb78-4395-87e8-c326be51e429)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-267m7_kube-system(17460e26-fb78-4395-87e8-c326be51e429)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-267m7" podUID="17460e26-fb78-4395-87e8-c326be51e429" Jan 17 13:44:22.506689 kubelet[2728]: I0117 13:44:22.506358 2728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:22.509104 kubelet[2728]: I0117 13:44:22.508316 2728 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:22.510333 containerd[1493]: time="2025-01-17T13:44:22.510089218Z" level=info msg="StopPodSandbox for \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\"" Jan 17 13:44:22.511825 containerd[1493]: time="2025-01-17T13:44:22.511243104Z" level=info msg="Ensure that sandbox 28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4 in task-service has been cleanup successfully" Jan 17 13:44:22.513208 containerd[1493]: time="2025-01-17T13:44:22.512323728Z" level=info msg="StopPodSandbox for \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\"" Jan 17 13:44:22.513208 containerd[1493]: time="2025-01-17T13:44:22.512745954Z" level=info msg="Ensure that sandbox 0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3 in task-service has been cleanup successfully" Jan 17 13:44:22.604870 containerd[1493]: time="2025-01-17T13:44:22.604769876Z" level=error msg="StopPodSandbox for \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\" failed" error="failed to destroy network for sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:22.605364 kubelet[2728]: E0117 13:44:22.605297 2728 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:22.606312 kubelet[2728]: E0117 13:44:22.606031 2728 kuberuntime_manager.go:1375] "Failed to stop sandbox" 
podSandboxID={"Type":"containerd","ID":"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3"} Jan 17 13:44:22.606312 kubelet[2728]: E0117 13:44:22.606123 2728 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"17460e26-fb78-4395-87e8-c326be51e429\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 17 13:44:22.606312 kubelet[2728]: E0117 13:44:22.606249 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"17460e26-fb78-4395-87e8-c326be51e429\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-267m7" podUID="17460e26-fb78-4395-87e8-c326be51e429" Jan 17 13:44:22.622627 containerd[1493]: time="2025-01-17T13:44:22.622564174Z" level=error msg="StopPodSandbox for \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\" failed" error="failed to destroy network for sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 17 13:44:22.623203 kubelet[2728]: E0117 13:44:22.623112 2728 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:22.623297 kubelet[2728]: E0117 13:44:22.623255 2728 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4"} Jan 17 13:44:22.623374 kubelet[2728]: E0117 13:44:22.623323 2728 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"aede887a-58e3-4229-a42e-9f14a0e48c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Jan 17 13:44:22.623470 kubelet[2728]: E0117 13:44:22.623367 2728 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"aede887a-58e3-4229-a42e-9f14a0e48c86\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-nlgqc" podUID="aede887a-58e3-4229-a42e-9f14a0e48c86" Jan 17 13:44:22.907917 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3-shm.mount: Deactivated successfully. Jan 17 13:44:31.140852 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1676071762.mount: Deactivated successfully. Jan 17 13:44:31.321734 containerd[1493]: time="2025-01-17T13:44:31.293383222Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Jan 17 13:44:31.326259 containerd[1493]: time="2025-01-17T13:44:31.325053197Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:31.362983 containerd[1493]: time="2025-01-17T13:44:31.362942063Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:31.364427 containerd[1493]: time="2025-01-17T13:44:31.364386601Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:31.369789 containerd[1493]: time="2025-01-17T13:44:31.369748974Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 10.887929892s" Jan 17 13:44:31.369950 containerd[1493]: time="2025-01-17T13:44:31.369804257Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Jan 17 13:44:31.443784 containerd[1493]: time="2025-01-17T13:44:31.443505053Z" level=info msg="CreateContainer within sandbox \"924e75464a261d9646f988b7e46bc53c3425cf2ce7f7e4277a73f469ac12992e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 17 13:44:31.525922 containerd[1493]: time="2025-01-17T13:44:31.525860916Z" level=info msg="CreateContainer within sandbox \"924e75464a261d9646f988b7e46bc53c3425cf2ce7f7e4277a73f469ac12992e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"669e22c5f197bf9e965b7ddd5cb2984e13e687493d5427958715fbf85805b271\"" Jan 17 13:44:31.532645 containerd[1493]: time="2025-01-17T13:44:31.532589155Z" level=info msg="StartContainer for \"669e22c5f197bf9e965b7ddd5cb2984e13e687493d5427958715fbf85805b271\"" Jan 17 13:44:31.708150 systemd[1]: Started cri-containerd-669e22c5f197bf9e965b7ddd5cb2984e13e687493d5427958715fbf85805b271.scope - libcontainer container 669e22c5f197bf9e965b7ddd5cb2984e13e687493d5427958715fbf85805b271. Jan 17 13:44:31.785878 containerd[1493]: time="2025-01-17T13:44:31.785796237Z" level=info msg="StartContainer for \"669e22c5f197bf9e965b7ddd5cb2984e13e687493d5427958715fbf85805b271\" returns successfully" Jan 17 13:44:31.988787 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. 
Jan 17 13:44:31.990301 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 17 13:44:32.613506 kubelet[2728]: I0117 13:44:32.611723 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-c7vcc" podStartSLOduration=4.005504069 podStartE2EDuration="28.595385624s" podCreationTimestamp="2025-01-17 13:44:04 +0000 UTC" firstStartedPulling="2025-01-17 13:44:06.786103106 +0000 UTC m=+24.773681659" lastFinishedPulling="2025-01-17 13:44:31.375984649 +0000 UTC m=+49.363563214" observedRunningTime="2025-01-17 13:44:32.592163257 +0000 UTC m=+50.579741851" watchObservedRunningTime="2025-01-17 13:44:32.595385624 +0000 UTC m=+50.582964187" Jan 17 13:44:33.285917 containerd[1493]: time="2025-01-17T13:44:33.284275759Z" level=info msg="StopPodSandbox for \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\"" Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.389 [INFO][3991] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.391 [INFO][3991] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" iface="eth0" netns="/var/run/netns/cni-f10e96c4-53aa-6b1a-c1aa-2beefa70060c" Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.391 [INFO][3991] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" iface="eth0" netns="/var/run/netns/cni-f10e96c4-53aa-6b1a-c1aa-2beefa70060c" Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.394 [INFO][3991] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" iface="eth0" netns="/var/run/netns/cni-f10e96c4-53aa-6b1a-c1aa-2beefa70060c" Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.394 [INFO][3991] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.394 [INFO][3991] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.773 [INFO][3997] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" HandleID="k8s-pod-network.f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.774 [INFO][3997] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.775 [INFO][3997] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.804 [WARNING][3997] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" HandleID="k8s-pod-network.f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.804 [INFO][3997] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" HandleID="k8s-pod-network.f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.812 [INFO][3997] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:33.823607 containerd[1493]: 2025-01-17 13:44:33.818 [INFO][3991] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:33.830308 systemd[1]: run-netns-cni\x2df10e96c4\x2d53aa\x2d6b1a\x2dc1aa\x2d2beefa70060c.mount: Deactivated successfully. Jan 17 13:44:33.844220 containerd[1493]: time="2025-01-17T13:44:33.843255429Z" level=info msg="TearDown network for sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\" successfully" Jan 17 13:44:33.844220 containerd[1493]: time="2025-01-17T13:44:33.843342806Z" level=info msg="StopPodSandbox for \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\" returns successfully" Jan 17 13:44:33.870227 containerd[1493]: time="2025-01-17T13:44:33.866071919Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9fwmg,Uid:41e59e9c-f5c4-48af-a614-7a43cf86d00d,Namespace:calico-system,Attempt:1,}" Jan 17 13:44:33.869940 systemd[1]: run-containerd-runc-k8s.io-669e22c5f197bf9e965b7ddd5cb2984e13e687493d5427958715fbf85805b271-runc.SNN38B.mount: Deactivated successfully. 
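[Editor's note] The WARNING in the teardown above ("Asked to release address but it doesn't exist. Ignoring") is expected rather than a new failure: the earlier ADD for this sandbox died before any IP was assigned, and the CNI spec requires DEL to be idempotent, so the IPAM release treats a missing handle as already done. A minimal sketch of that idempotent release, using hypothetical names (not Calico's actual code):

    package main

    import "fmt"

    // releaseByHandle stands in for the "Releasing address using handleID"
    // step of the teardown above. A missing allocation is logged and ignored
    // instead of being returned as an error, so DEL still succeeds.
    func releaseByHandle(alloc map[string]string, handleID string) error {
    	if _, ok := alloc[handleID]; !ok {
    		fmt.Printf("WARNING: asked to release %s but it doesn't exist; ignoring\n", handleID)
    		return nil
    	}
    	delete(alloc, handleID)
    	return nil
    }

    func main() {
    	alloc := map[string]string{} // no IP was ever assigned to this sandbox
    	_ = releaseByHandle(alloc, "k8s-pod-network.demo-handle")
    	fmt.Println("teardown processing complete")
    }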
Jan 17 13:44:34.294226 containerd[1493]: time="2025-01-17T13:44:34.288690064Z" level=info msg="StopPodSandbox for \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\"" Jan 17 13:44:34.290425 systemd-networkd[1410]: calieb44a8c3a5d: Link UP Jan 17 13:44:34.290782 systemd-networkd[1410]: calieb44a8c3a5d: Gained carrier Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.066 [INFO][4105] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.094 [INFO][4105] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0 csi-node-driver- calico-system 41e59e9c-f5c4-48af-a614-7a43cf86d00d 794 0 2025-01-17 13:44:04 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-so9hk.gb1.brightbox.com csi-node-driver-9fwmg eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calieb44a8c3a5d [] []}} ContainerID="e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" Namespace="calico-system" Pod="csi-node-driver-9fwmg" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.095 [INFO][4105] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" Namespace="calico-system" Pod="csi-node-driver-9fwmg" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.165 [INFO][4125] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" HandleID="k8s-pod-network.e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.187 [INFO][4125] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" HandleID="k8s-pod-network.e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003a5860), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-so9hk.gb1.brightbox.com", "pod":"csi-node-driver-9fwmg", "timestamp":"2025-01-17 13:44:34.165161543 +0000 UTC"}, Hostname:"srv-so9hk.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.188 [INFO][4125] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.188 [INFO][4125] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.188 [INFO][4125] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-so9hk.gb1.brightbox.com' Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.191 [INFO][4125] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.205 [INFO][4125] ipam/ipam.go 372: Looking up existing affinities for host host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.218 [INFO][4125] ipam/ipam.go 489: Trying affinity for 192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.222 [INFO][4125] ipam/ipam.go 155: Attempting to load block cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.226 [INFO][4125] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.226 [INFO][4125] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.57.192/26 handle="k8s-pod-network.e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.229 [INFO][4125] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3 Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.236 [INFO][4125] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.57.192/26 handle="k8s-pod-network.e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.250 [INFO][4125] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.57.193/26] block=192.168.57.192/26 handle="k8s-pod-network.e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.250 [INFO][4125] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.57.193/26] handle="k8s-pod-network.e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.251 [INFO][4125] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
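[Editor's note] Between the "Acquired" and "Released host-wide IPAM lock" entries above, the plugin walks Calico's block-affinity IPAM: this host holds an affinity for the /26 block 192.168.57.192/26 (64 addresses) and claims free /32s from it. The block's first address was evidently already taken on this node (commonly by the node's own IPIP/VXLAN tunnel endpoint), so the first workload gets 192.168.57.193. A minimal sketch of that assignment step, with hypothetical types rather than Calico's actual code:

    package main

    import (
    	"fmt"
    	"net/netip"
    )

    // block models a host-affine CIDR block such as 192.168.57.192/26.
    type block struct {
    	cidr netip.Prefix
    	used map[netip.Addr]bool
    }

    // assign returns the first free address in the block, mirroring the
    // "Attempting to assign 1 addresses from block" step in the log.
    func (b *block) assign() (netip.Addr, bool) {
    	for a := b.cidr.Addr(); b.cidr.Contains(a); a = a.Next() {
    		if !b.used[a] {
    			b.used[a] = true
    			return a, true
    		}
    	}
    	return netip.Addr{}, false
    }

    func main() {
    	b := &block{
    		cidr: netip.MustParsePrefix("192.168.57.192/26"),
    		used: map[netip.Addr]bool{},
    	}
    	// Assume .192 is already claimed on this node, as the log implies.
    	b.used[netip.MustParseAddr("192.168.57.192")] = true
    	a, _ := b.assign()
    	fmt.Println(a) // 192.168.57.193, matching the claimed IP above
    }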
Jan 17 13:44:34.327771 containerd[1493]: 2025-01-17 13:44:34.251 [INFO][4125] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.193/26] IPv6=[] ContainerID="e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" HandleID="k8s-pod-network.e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:34.336473 containerd[1493]: 2025-01-17 13:44:34.255 [INFO][4105] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" Namespace="calico-system" Pod="csi-node-driver-9fwmg" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"41e59e9c-f5c4-48af-a614-7a43cf86d00d", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-9fwmg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.57.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieb44a8c3a5d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:34.336473 containerd[1493]: 2025-01-17 13:44:34.255 [INFO][4105] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.57.193/32] ContainerID="e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" Namespace="calico-system" Pod="csi-node-driver-9fwmg" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:34.336473 containerd[1493]: 2025-01-17 13:44:34.255 [INFO][4105] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieb44a8c3a5d ContainerID="e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" Namespace="calico-system" Pod="csi-node-driver-9fwmg" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:34.336473 containerd[1493]: 2025-01-17 13:44:34.282 [INFO][4105] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" Namespace="calico-system" Pod="csi-node-driver-9fwmg" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:34.336473 containerd[1493]: 2025-01-17 13:44:34.286 [INFO][4105] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" Namespace="calico-system" Pod="csi-node-driver-9fwmg" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"41e59e9c-f5c4-48af-a614-7a43cf86d00d", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3", Pod:"csi-node-driver-9fwmg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.57.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieb44a8c3a5d", MAC:"62:b6:8d:1b:8d:01", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:34.336473 containerd[1493]: 2025-01-17 13:44:34.323 [INFO][4105] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3" Namespace="calico-system" Pod="csi-node-driver-9fwmg" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:34.550413 containerd[1493]: time="2025-01-17T13:44:34.549495835Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:44:34.550413 containerd[1493]: time="2025-01-17T13:44:34.549622589Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:44:34.550413 containerd[1493]: time="2025-01-17T13:44:34.549690465Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:34.550413 containerd[1493]: time="2025-01-17T13:44:34.549882910Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:34.617153 systemd[1]: Started cri-containerd-e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3.scope - libcontainer container e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3. Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.556 [INFO][4152] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.556 [INFO][4152] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" iface="eth0" netns="/var/run/netns/cni-bef3b4bb-7d0b-a7e8-7f06-e2bb7dce4970" Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.563 [INFO][4152] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" iface="eth0" netns="/var/run/netns/cni-bef3b4bb-7d0b-a7e8-7f06-e2bb7dce4970" Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.564 [INFO][4152] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" iface="eth0" netns="/var/run/netns/cni-bef3b4bb-7d0b-a7e8-7f06-e2bb7dce4970" Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.564 [INFO][4152] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.565 [INFO][4152] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.645 [INFO][4212] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" HandleID="k8s-pod-network.28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.645 [INFO][4212] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.645 [INFO][4212] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.656 [WARNING][4212] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" HandleID="k8s-pod-network.28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.656 [INFO][4212] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" HandleID="k8s-pod-network.28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.660 [INFO][4212] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:34.676764 containerd[1493]: 2025-01-17 13:44:34.663 [INFO][4152] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:34.679788 containerd[1493]: time="2025-01-17T13:44:34.677575001Z" level=info msg="TearDown network for sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\" successfully" Jan 17 13:44:34.679788 containerd[1493]: time="2025-01-17T13:44:34.677622188Z" level=info msg="StopPodSandbox for \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\" returns successfully" Jan 17 13:44:34.679788 containerd[1493]: time="2025-01-17T13:44:34.679098424Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-nlgqc,Uid:aede887a-58e3-4229-a42e-9f14a0e48c86,Namespace:kube-system,Attempt:1,}" Jan 17 13:44:34.693236 kernel: bpftool[4232]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 17 13:44:34.826958 containerd[1493]: time="2025-01-17T13:44:34.826740563Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9fwmg,Uid:41e59e9c-f5c4-48af-a614-7a43cf86d00d,Namespace:calico-system,Attempt:1,} returns sandbox id \"e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3\"" Jan 17 13:44:34.838190 systemd[1]: run-netns-cni\x2dbef3b4bb\x2d7d0b\x2da7e8\x2d7f06\x2de2bb7dce4970.mount: Deactivated successfully. Jan 17 13:44:34.842422 containerd[1493]: time="2025-01-17T13:44:34.842389418Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Jan 17 13:44:34.877102 systemd[1]: run-containerd-runc-k8s.io-669e22c5f197bf9e965b7ddd5cb2984e13e687493d5427958715fbf85805b271-runc.LOiOzL.mount: Deactivated successfully. Jan 17 13:44:35.035091 systemd-networkd[1410]: calicfcbf8534c4: Link UP Jan 17 13:44:35.041080 systemd-networkd[1410]: calicfcbf8534c4: Gained carrier Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:34.883 [INFO][4241] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0 coredns-7db6d8ff4d- kube-system aede887a-58e3-4229-a42e-9f14a0e48c86 801 0 2025-01-17 13:43:57 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-so9hk.gb1.brightbox.com coredns-7db6d8ff4d-nlgqc eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calicfcbf8534c4 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nlgqc" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-" Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:34.884 [INFO][4241] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nlgqc" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:34.960 [INFO][4275] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" HandleID="k8s-pod-network.74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:34.974 [INFO][4275] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" HandleID="k8s-pod-network.74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000318780), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-so9hk.gb1.brightbox.com", "pod":"coredns-7db6d8ff4d-nlgqc", "timestamp":"2025-01-17 13:44:34.958160066 +0000 UTC"}, Hostname:"srv-so9hk.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:34.975 [INFO][4275] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:34.975 [INFO][4275] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:34.975 [INFO][4275] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-so9hk.gb1.brightbox.com' Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:34.978 [INFO][4275] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:34.990 [INFO][4275] ipam/ipam.go 372: Looking up existing affinities for host host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:34.998 [INFO][4275] ipam/ipam.go 489: Trying affinity for 192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:35.001 [INFO][4275] ipam/ipam.go 155: Attempting to load block cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:35.005 [INFO][4275] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:35.005 [INFO][4275] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.57.192/26 handle="k8s-pod-network.74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:35.007 [INFO][4275] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:35.013 [INFO][4275] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.57.192/26 handle="k8s-pod-network.74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:35.026 [INFO][4275] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.57.194/26] block=192.168.57.192/26 handle="k8s-pod-network.74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:35.026 [INFO][4275] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.57.194/26] handle="k8s-pod-network.74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" host="srv-so9hk.gb1.brightbox.com" Jan 
17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:35.026 [INFO][4275] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:35.066726 containerd[1493]: 2025-01-17 13:44:35.026 [INFO][4275] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.194/26] IPv6=[] ContainerID="74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" HandleID="k8s-pod-network.74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:35.068332 containerd[1493]: 2025-01-17 13:44:35.029 [INFO][4241] cni-plugin/k8s.go 386: Populated endpoint ContainerID="74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nlgqc" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"aede887a-58e3-4229-a42e-9f14a0e48c86", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 43, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7db6d8ff4d-nlgqc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicfcbf8534c4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:35.068332 containerd[1493]: 2025-01-17 13:44:35.030 [INFO][4241] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.57.194/32] ContainerID="74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nlgqc" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:35.068332 containerd[1493]: 2025-01-17 13:44:35.030 [INFO][4241] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicfcbf8534c4 ContainerID="74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nlgqc" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:35.068332 containerd[1493]: 2025-01-17 13:44:35.038 [INFO][4241] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nlgqc" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:35.068332 containerd[1493]: 2025-01-17 13:44:35.041 [INFO][4241] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nlgqc" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"aede887a-58e3-4229-a42e-9f14a0e48c86", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 43, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e", Pod:"coredns-7db6d8ff4d-nlgqc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicfcbf8534c4", MAC:"22:de:49:52:0a:77", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:35.068332 containerd[1493]: 2025-01-17 13:44:35.059 [INFO][4241] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e" Namespace="kube-system" Pod="coredns-7db6d8ff4d-nlgqc" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:35.130962 containerd[1493]: time="2025-01-17T13:44:35.130404795Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:44:35.132423 containerd[1493]: time="2025-01-17T13:44:35.130507085Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:44:35.133013 containerd[1493]: time="2025-01-17T13:44:35.132565120Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:35.133013 containerd[1493]: time="2025-01-17T13:44:35.132739358Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:35.193871 systemd[1]: Started cri-containerd-74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e.scope - libcontainer container 74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e. Jan 17 13:44:35.275811 containerd[1493]: time="2025-01-17T13:44:35.275712483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-nlgqc,Uid:aede887a-58e3-4229-a42e-9f14a0e48c86,Namespace:kube-system,Attempt:1,} returns sandbox id \"74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e\"" Jan 17 13:44:35.288052 containerd[1493]: time="2025-01-17T13:44:35.285815214Z" level=info msg="StopPodSandbox for \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\"" Jan 17 13:44:35.288052 containerd[1493]: time="2025-01-17T13:44:35.285909917Z" level=info msg="StopPodSandbox for \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\"" Jan 17 13:44:35.291613 containerd[1493]: time="2025-01-17T13:44:35.291544552Z" level=info msg="StopPodSandbox for \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\"" Jan 17 13:44:35.292616 containerd[1493]: time="2025-01-17T13:44:35.292525682Z" level=info msg="CreateContainer within sandbox \"74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 17 13:44:35.377618 containerd[1493]: time="2025-01-17T13:44:35.377048154Z" level=info msg="CreateContainer within sandbox \"74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d9e25fb118f3369d17311c07b9c45ca074760e5e04178ef0ab49257847347745\"" Jan 17 13:44:35.382298 containerd[1493]: time="2025-01-17T13:44:35.380728326Z" level=info msg="StartContainer for \"d9e25fb118f3369d17311c07b9c45ca074760e5e04178ef0ab49257847347745\"" Jan 17 13:44:35.447375 systemd-networkd[1410]: calieb44a8c3a5d: Gained IPv6LL Jan 17 13:44:35.505506 systemd[1]: Started cri-containerd-d9e25fb118f3369d17311c07b9c45ca074760e5e04178ef0ab49257847347745.scope - libcontainer container d9e25fb118f3369d17311c07b9c45ca074760e5e04178ef0ab49257847347745. Jan 17 13:44:35.660014 containerd[1493]: time="2025-01-17T13:44:35.658057290Z" level=info msg="StartContainer for \"d9e25fb118f3369d17311c07b9c45ca074760e5e04178ef0ab49257847347745\" returns successfully" Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.583 [INFO][4384] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.583 [INFO][4384] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" iface="eth0" netns="/var/run/netns/cni-5d8cfaf7-cdee-befc-33fc-bdd6292c0c28" Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.584 [INFO][4384] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" iface="eth0" netns="/var/run/netns/cni-5d8cfaf7-cdee-befc-33fc-bdd6292c0c28" Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.585 [INFO][4384] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" iface="eth0" netns="/var/run/netns/cni-5d8cfaf7-cdee-befc-33fc-bdd6292c0c28" Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.585 [INFO][4384] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.586 [INFO][4384] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.693 [INFO][4439] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" HandleID="k8s-pod-network.ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.693 [INFO][4439] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.694 [INFO][4439] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.717 [WARNING][4439] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" HandleID="k8s-pod-network.ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.717 [INFO][4439] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" HandleID="k8s-pod-network.ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.720 [INFO][4439] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:35.732050 containerd[1493]: 2025-01-17 13:44:35.727 [INFO][4384] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:35.734568 containerd[1493]: time="2025-01-17T13:44:35.733108041Z" level=info msg="TearDown network for sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\" successfully" Jan 17 13:44:35.734568 containerd[1493]: time="2025-01-17T13:44:35.733233029Z" level=info msg="StopPodSandbox for \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\" returns successfully" Jan 17 13:44:35.736314 containerd[1493]: time="2025-01-17T13:44:35.736142139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b8dc98c7-f9vfx,Uid:a205ba3a-791a-42f3-a1d4-a853e21d6652,Namespace:calico-apiserver,Attempt:1,}" Jan 17 13:44:35.837650 systemd[1]: run-containerd-runc-k8s.io-74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e-runc.kXPTaX.mount: Deactivated successfully. 
Jan 17 13:44:35.837830 systemd[1]: run-netns-cni\x2d5d8cfaf7\x2dcdee\x2dbefc\x2d33fc\x2dbdd6292c0c28.mount: Deactivated successfully. Jan 17 13:44:35.852620 systemd-networkd[1410]: vxlan.calico: Link UP Jan 17 13:44:35.852825 systemd-networkd[1410]: vxlan.calico: Gained carrier Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.566 [INFO][4382] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.566 [INFO][4382] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" iface="eth0" netns="/var/run/netns/cni-c50d3c6c-6e8d-2a32-6b3e-1b18ef500438" Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.566 [INFO][4382] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" iface="eth0" netns="/var/run/netns/cni-c50d3c6c-6e8d-2a32-6b3e-1b18ef500438" Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.567 [INFO][4382] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" iface="eth0" netns="/var/run/netns/cni-c50d3c6c-6e8d-2a32-6b3e-1b18ef500438" Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.567 [INFO][4382] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.568 [INFO][4382] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.777 [INFO][4434] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" HandleID="k8s-pod-network.fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.777 [INFO][4434] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.778 [INFO][4434] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.848 [WARNING][4434] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" HandleID="k8s-pod-network.fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.848 [INFO][4434] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" HandleID="k8s-pod-network.fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.853 [INFO][4434] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
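The mount units named run-netns-cni\x2d... in the two systemd entries above are escaped forms of the netns paths used during teardown: systemd turns "/" into "-" and a literal "-" in the path into "\x2d" (and /var/run is a symlink to /run, which is why the unit lacks the "var" component). A small decoder sketch; unescapeUnit is a hypothetical helper, not part of systemd, and it ignores the rest of the escape grammar:

    package main

    import (
        "fmt"
        "strings"
    )

    // unescapeUnit undoes systemd's path escaping for a mount unit name
    // (".mount" suffix already stripped): "\x2d" protects a literal dash,
    // every remaining "-" is a path separator, and the leading "/" is implied.
    func unescapeUnit(name string) string {
        s := strings.ReplaceAll(name, `\x2d`, "\x00") // park the escaped dashes
        s = strings.ReplaceAll(s, "-", "/")
        return "/" + strings.ReplaceAll(s, "\x00", "-")
    }

    func main() {
        fmt.Println(unescapeUnit(`run-netns-cni\x2d5d8cfaf7\x2dcdee\x2dbefc\x2d33fc\x2dbdd6292c0c28`))
        // Output: /run/netns/cni-5d8cfaf7-cdee-befc-33fc-bdd6292c0c28
    }
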
Jan 17 13:44:35.879341 containerd[1493]: 2025-01-17 13:44:35.862 [INFO][4382] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:35.882780 containerd[1493]: time="2025-01-17T13:44:35.882535652Z" level=info msg="TearDown network for sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\" successfully" Jan 17 13:44:35.883018 containerd[1493]: time="2025-01-17T13:44:35.882783313Z" level=info msg="StopPodSandbox for \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\" returns successfully" Jan 17 13:44:35.887562 containerd[1493]: time="2025-01-17T13:44:35.887528644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b8dc98c7-xbj84,Uid:3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3,Namespace:calico-apiserver,Attempt:1,}" Jan 17 13:44:35.897960 systemd[1]: run-netns-cni\x2dc50d3c6c\x2d6e8d\x2d2a32\x2d6b3e\x2d1b18ef500438.mount: Deactivated successfully. Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.587 [INFO][4385] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.587 [INFO][4385] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" iface="eth0" netns="/var/run/netns/cni-36aa9378-35f6-be59-912a-41a6e131c37e" Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.590 [INFO][4385] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" iface="eth0" netns="/var/run/netns/cni-36aa9378-35f6-be59-912a-41a6e131c37e" Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.590 [INFO][4385] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" iface="eth0" netns="/var/run/netns/cni-36aa9378-35f6-be59-912a-41a6e131c37e" Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.590 [INFO][4385] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.590 [INFO][4385] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.789 [INFO][4442] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" HandleID="k8s-pod-network.8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.793 [INFO][4442] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.853 [INFO][4442] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.900 [WARNING][4442] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" HandleID="k8s-pod-network.8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.901 [INFO][4442] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" HandleID="k8s-pod-network.8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.930 [INFO][4442] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:35.951526 containerd[1493]: 2025-01-17 13:44:35.936 [INFO][4385] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:35.957068 containerd[1493]: time="2025-01-17T13:44:35.956994390Z" level=info msg="TearDown network for sandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\" successfully" Jan 17 13:44:35.957441 containerd[1493]: time="2025-01-17T13:44:35.957036010Z" level=info msg="StopPodSandbox for \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\" returns successfully" Jan 17 13:44:35.963231 containerd[1493]: time="2025-01-17T13:44:35.962905949Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-848c57f8ff-m75gk,Uid:a817919e-4a2a-40c6-a08c-acbbfe256250,Namespace:calico-system,Attempt:1,}" Jan 17 13:44:36.286263 containerd[1493]: time="2025-01-17T13:44:36.286068365Z" level=info msg="StopPodSandbox for \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\"" Jan 17 13:44:36.391775 systemd-networkd[1410]: calicfa90ebbbb1: Link UP Jan 17 13:44:36.403875 systemd-networkd[1410]: calicfa90ebbbb1: Gained carrier Jan 17 13:44:36.409498 systemd-networkd[1410]: calicfcbf8534c4: Gained IPv6LL Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.019 [INFO][4489] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0 calico-apiserver-55b8dc98c7- calico-apiserver 3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3 814 0 2025-01-17 13:44:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:55b8dc98c7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-so9hk.gb1.brightbox.com calico-apiserver-55b8dc98c7-xbj84 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calicfa90ebbbb1 [] []}} ContainerID="9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-xbj84" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-" Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.022 [INFO][4489] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-xbj84" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 
13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.180 [INFO][4516] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" HandleID="k8s-pod-network.9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.235 [INFO][4516] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" HandleID="k8s-pod-network.9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004ca5f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-so9hk.gb1.brightbox.com", "pod":"calico-apiserver-55b8dc98c7-xbj84", "timestamp":"2025-01-17 13:44:36.180324713 +0000 UTC"}, Hostname:"srv-so9hk.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.236 [INFO][4516] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.236 [INFO][4516] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.236 [INFO][4516] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-so9hk.gb1.brightbox.com' Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.243 [INFO][4516] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.255 [INFO][4516] ipam/ipam.go 372: Looking up existing affinities for host host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.281 [INFO][4516] ipam/ipam.go 489: Trying affinity for 192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.302 [INFO][4516] ipam/ipam.go 155: Attempting to load block cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.307 [INFO][4516] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.307 [INFO][4516] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.57.192/26 handle="k8s-pod-network.9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.312 [INFO][4516] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7 Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.322 [INFO][4516] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.57.192/26 handle="k8s-pod-network.9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.344 
[INFO][4516] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.57.195/26] block=192.168.57.192/26 handle="k8s-pod-network.9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.344 [INFO][4516] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.57.195/26] handle="k8s-pod-network.9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.345 [INFO][4516] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:36.461527 containerd[1493]: 2025-01-17 13:44:36.346 [INFO][4516] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.195/26] IPv6=[] ContainerID="9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" HandleID="k8s-pod-network.9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:36.463831 containerd[1493]: 2025-01-17 13:44:36.357 [INFO][4489] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-xbj84" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0", GenerateName:"calico-apiserver-55b8dc98c7-", Namespace:"calico-apiserver", SelfLink:"", UID:"3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55b8dc98c7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-55b8dc98c7-xbj84", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicfa90ebbbb1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:36.463831 containerd[1493]: 2025-01-17 13:44:36.357 [INFO][4489] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.57.195/32] ContainerID="9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-xbj84" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:36.463831 containerd[1493]: 2025-01-17 13:44:36.357 [INFO][4489] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicfa90ebbbb1 ContainerID="9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" 
Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-xbj84" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:36.463831 containerd[1493]: 2025-01-17 13:44:36.411 [INFO][4489] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-xbj84" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:36.463831 containerd[1493]: 2025-01-17 13:44:36.426 [INFO][4489] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-xbj84" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0", GenerateName:"calico-apiserver-55b8dc98c7-", Namespace:"calico-apiserver", SelfLink:"", UID:"3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55b8dc98c7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7", Pod:"calico-apiserver-55b8dc98c7-xbj84", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicfa90ebbbb1", MAC:"3e:b0:ab:7c:57:33", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:36.463831 containerd[1493]: 2025-01-17 13:44:36.450 [INFO][4489] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-xbj84" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:36.588433 containerd[1493]: time="2025-01-17T13:44:36.588113076Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:44:36.591734 containerd[1493]: time="2025-01-17T13:44:36.588652963Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:44:36.592334 containerd[1493]: time="2025-01-17T13:44:36.592003919Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:36.592334 containerd[1493]: time="2025-01-17T13:44:36.592213047Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:36.642404 systemd[1]: Started cri-containerd-9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7.scope - libcontainer container 9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7. Jan 17 13:44:36.811628 kubelet[2728]: I0117 13:44:36.811329 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-nlgqc" podStartSLOduration=39.811015495 podStartE2EDuration="39.811015495s" podCreationTimestamp="2025-01-17 13:43:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-17 13:44:36.810585423 +0000 UTC m=+54.798164016" watchObservedRunningTime="2025-01-17 13:44:36.811015495 +0000 UTC m=+54.798594060" Jan 17 13:44:36.838640 systemd[1]: run-netns-cni\x2d36aa9378\x2d35f6\x2dbe59\x2d912a\x2d41a6e131c37e.mount: Deactivated successfully. Jan 17 13:44:37.000780 systemd-networkd[1410]: cali5ed59e3e92a: Link UP Jan 17 13:44:37.004876 systemd-networkd[1410]: cali5ed59e3e92a: Gained carrier Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.253 [INFO][4471] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0 calico-apiserver-55b8dc98c7- calico-apiserver a205ba3a-791a-42f3-a1d4-a853e21d6652 815 0 2025-01-17 13:44:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:55b8dc98c7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-so9hk.gb1.brightbox.com calico-apiserver-55b8dc98c7-f9vfx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5ed59e3e92a [] []}} ContainerID="c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-f9vfx" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.253 [INFO][4471] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-f9vfx" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.613 [INFO][4553] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" HandleID="k8s-pod-network.c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.694 [INFO][4553] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" HandleID="k8s-pod-network.c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000392970), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-so9hk.gb1.brightbox.com", "pod":"calico-apiserver-55b8dc98c7-f9vfx", "timestamp":"2025-01-17 13:44:36.612990898 +0000 UTC"}, Hostname:"srv-so9hk.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.694 [INFO][4553] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.694 [INFO][4553] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.694 [INFO][4553] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-so9hk.gb1.brightbox.com' Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.702 [INFO][4553] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.740 [INFO][4553] ipam/ipam.go 372: Looking up existing affinities for host host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.788 [INFO][4553] ipam/ipam.go 489: Trying affinity for 192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.810 [INFO][4553] ipam/ipam.go 155: Attempting to load block cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.857 [INFO][4553] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.857 [INFO][4553] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.57.192/26 handle="k8s-pod-network.c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.930 [INFO][4553] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98 Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.944 [INFO][4553] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.57.192/26 handle="k8s-pod-network.c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.978 [INFO][4553] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.57.196/26] block=192.168.57.192/26 handle="k8s-pod-network.c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.978 [INFO][4553] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.57.196/26] handle="k8s-pod-network.c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.978 [INFO][4553] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
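That claim sequence is the core of Calico's IPAM: the node holds an affinity for the block 192.168.57.192/26 (/26 is Calico's default IPv4 block size, 64 addresses), and each pod on this host draws its address from that block under the host-wide lock. The block a given address belongs to is just the address masked to the block's prefix length, as a standard-library sketch shows:

    package main

    import (
        "fmt"
        "net/netip"
    )

    func main() {
        // The IP just claimed for calico-apiserver-55b8dc98c7-f9vfx ...
        addr := netip.MustParseAddr("192.168.57.196")
        // ... masked to Calico's default /26 block size yields the affine block.
        fmt.Println(netip.PrefixFrom(addr, 26).Masked()) // 192.168.57.192/26
    }
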
Jan 17 13:44:37.038298 containerd[1493]: 2025-01-17 13:44:36.978 [INFO][4553] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.196/26] IPv6=[] ContainerID="c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" HandleID="k8s-pod-network.c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:37.043132 containerd[1493]: 2025-01-17 13:44:36.989 [INFO][4471] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-f9vfx" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0", GenerateName:"calico-apiserver-55b8dc98c7-", Namespace:"calico-apiserver", SelfLink:"", UID:"a205ba3a-791a-42f3-a1d4-a853e21d6652", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55b8dc98c7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-55b8dc98c7-f9vfx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5ed59e3e92a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:37.043132 containerd[1493]: 2025-01-17 13:44:36.989 [INFO][4471] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.57.196/32] ContainerID="c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-f9vfx" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:37.043132 containerd[1493]: 2025-01-17 13:44:36.989 [INFO][4471] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5ed59e3e92a ContainerID="c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-f9vfx" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:37.043132 containerd[1493]: 2025-01-17 13:44:37.006 [INFO][4471] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-f9vfx" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:37.043132 containerd[1493]: 2025-01-17 13:44:37.007 [INFO][4471] cni-plugin/k8s.go 
414: Added Mac, interface name, and active container ID to endpoint ContainerID="c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-f9vfx" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0", GenerateName:"calico-apiserver-55b8dc98c7-", Namespace:"calico-apiserver", SelfLink:"", UID:"a205ba3a-791a-42f3-a1d4-a853e21d6652", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55b8dc98c7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98", Pod:"calico-apiserver-55b8dc98c7-f9vfx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5ed59e3e92a", MAC:"c2:45:9c:d8:ac:42", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:37.043132 containerd[1493]: 2025-01-17 13:44:37.030 [INFO][4471] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98" Namespace="calico-apiserver" Pod="calico-apiserver-55b8dc98c7-f9vfx" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:37.173847 systemd-networkd[1410]: vxlan.calico: Gained IPv6LL Jan 17 13:44:37.197363 containerd[1493]: time="2025-01-17T13:44:37.196011248Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:44:37.197363 containerd[1493]: time="2025-01-17T13:44:37.196299272Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:44:37.197363 containerd[1493]: time="2025-01-17T13:44:37.196329472Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:37.197613 containerd[1493]: time="2025-01-17T13:44:37.197481596Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:37.242971 systemd-networkd[1410]: cali784966dd6a8: Link UP Jan 17 13:44:37.243466 systemd-networkd[1410]: cali784966dd6a8: Gained carrier Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:36.665 [INFO][4555] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:36.668 [INFO][4555] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" iface="eth0" netns="/var/run/netns/cni-ff167f5e-9763-76b8-8c00-64e5b8954282" Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:36.669 [INFO][4555] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" iface="eth0" netns="/var/run/netns/cni-ff167f5e-9763-76b8-8c00-64e5b8954282" Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:36.672 [INFO][4555] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" iface="eth0" netns="/var/run/netns/cni-ff167f5e-9763-76b8-8c00-64e5b8954282" Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:36.672 [INFO][4555] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:36.672 [INFO][4555] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:36.822 [INFO][4617] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" HandleID="k8s-pod-network.0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:36.822 [INFO][4617] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:37.207 [INFO][4617] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:37.250 [WARNING][4617] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" HandleID="k8s-pod-network.0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:37.250 [INFO][4617] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" HandleID="k8s-pod-network.0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:37.315 [INFO][4617] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:37.341228 containerd[1493]: 2025-01-17 13:44:37.327 [INFO][4555] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:37.355288 containerd[1493]: time="2025-01-17T13:44:37.350826939Z" level=info msg="TearDown network for sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\" successfully" Jan 17 13:44:37.355288 containerd[1493]: time="2025-01-17T13:44:37.352494158Z" level=info msg="StopPodSandbox for \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\" returns successfully" Jan 17 13:44:37.353033 systemd[1]: Started cri-containerd-c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98.scope - libcontainer container c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98. Jan 17 13:44:37.363018 systemd[1]: run-netns-cni\x2dff167f5e\x2d9763\x2d76b8\x2d8c00\x2d64e5b8954282.mount: Deactivated successfully. Jan 17 13:44:37.370938 containerd[1493]: time="2025-01-17T13:44:37.369918503Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-267m7,Uid:17460e26-fb78-4395-87e8-c326be51e429,Namespace:kube-system,Attempt:1,}" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:36.310 [INFO][4521] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0 calico-kube-controllers-848c57f8ff- calico-system a817919e-4a2a-40c6-a08c-acbbfe256250 816 0 2025-01-17 13:44:05 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:848c57f8ff projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-so9hk.gb1.brightbox.com calico-kube-controllers-848c57f8ff-m75gk eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali784966dd6a8 [] []}} ContainerID="3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" Namespace="calico-system" Pod="calico-kube-controllers-848c57f8ff-m75gk" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:36.310 [INFO][4521] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" Namespace="calico-system" Pod="calico-kube-controllers-848c57f8ff-m75gk" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:36.610 [INFO][4562] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" HandleID="k8s-pod-network.3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:36.709 [INFO][4562] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" HandleID="k8s-pod-network.3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000397440), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-so9hk.gb1.brightbox.com", 
"pod":"calico-kube-controllers-848c57f8ff-m75gk", "timestamp":"2025-01-17 13:44:36.610211345 +0000 UTC"}, Hostname:"srv-so9hk.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:36.709 [INFO][4562] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:36.981 [INFO][4562] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:36.981 [INFO][4562] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-so9hk.gb1.brightbox.com' Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:36.988 [INFO][4562] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:37.032 [INFO][4562] ipam/ipam.go 372: Looking up existing affinities for host host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:37.072 [INFO][4562] ipam/ipam.go 489: Trying affinity for 192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:37.088 [INFO][4562] ipam/ipam.go 155: Attempting to load block cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:37.096 [INFO][4562] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:37.096 [INFO][4562] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.57.192/26 handle="k8s-pod-network.3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:37.102 [INFO][4562] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023 Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:37.129 [INFO][4562] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.57.192/26 handle="k8s-pod-network.3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:37.207 [INFO][4562] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.57.197/26] block=192.168.57.192/26 handle="k8s-pod-network.3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:37.207 [INFO][4562] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.57.197/26] handle="k8s-pod-network.3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:37.207 [INFO][4562] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jan 17 13:44:37.378865 containerd[1493]: 2025-01-17 13:44:37.207 [INFO][4562] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.197/26] IPv6=[] ContainerID="3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" HandleID="k8s-pod-network.3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:37.380409 containerd[1493]: 2025-01-17 13:44:37.219 [INFO][4521] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" Namespace="calico-system" Pod="calico-kube-controllers-848c57f8ff-m75gk" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0", GenerateName:"calico-kube-controllers-848c57f8ff-", Namespace:"calico-system", SelfLink:"", UID:"a817919e-4a2a-40c6-a08c-acbbfe256250", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"848c57f8ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-848c57f8ff-m75gk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.57.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali784966dd6a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:37.380409 containerd[1493]: 2025-01-17 13:44:37.219 [INFO][4521] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.57.197/32] ContainerID="3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" Namespace="calico-system" Pod="calico-kube-controllers-848c57f8ff-m75gk" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:37.380409 containerd[1493]: 2025-01-17 13:44:37.219 [INFO][4521] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali784966dd6a8 ContainerID="3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" Namespace="calico-system" Pod="calico-kube-controllers-848c57f8ff-m75gk" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:37.380409 containerd[1493]: 2025-01-17 13:44:37.247 [INFO][4521] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" Namespace="calico-system" Pod="calico-kube-controllers-848c57f8ff-m75gk" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 
13:44:37.380409 containerd[1493]: 2025-01-17 13:44:37.252 [INFO][4521] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" Namespace="calico-system" Pod="calico-kube-controllers-848c57f8ff-m75gk" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0", GenerateName:"calico-kube-controllers-848c57f8ff-", Namespace:"calico-system", SelfLink:"", UID:"a817919e-4a2a-40c6-a08c-acbbfe256250", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"848c57f8ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023", Pod:"calico-kube-controllers-848c57f8ff-m75gk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.57.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali784966dd6a8", MAC:"4e:db:d2:3f:b5:84", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:37.380409 containerd[1493]: 2025-01-17 13:44:37.347 [INFO][4521] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023" Namespace="calico-system" Pod="calico-kube-controllers-848c57f8ff-m75gk" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:37.496307 containerd[1493]: time="2025-01-17T13:44:37.495988420Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:44:37.496307 containerd[1493]: time="2025-01-17T13:44:37.496105675Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:44:37.499082 containerd[1493]: time="2025-01-17T13:44:37.496146562Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:37.499082 containerd[1493]: time="2025-01-17T13:44:37.496391170Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:37.601852 containerd[1493]: time="2025-01-17T13:44:37.601766884Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b8dc98c7-xbj84,Uid:3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7\"" Jan 17 13:44:37.647151 systemd[1]: Started cri-containerd-3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023.scope - libcontainer container 3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023. Jan 17 13:44:37.677198 containerd[1493]: time="2025-01-17T13:44:37.676969689Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55b8dc98c7-f9vfx,Uid:a205ba3a-791a-42f3-a1d4-a853e21d6652,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98\"" Jan 17 13:44:37.877562 systemd-networkd[1410]: calicfa90ebbbb1: Gained IPv6LL Jan 17 13:44:37.931257 containerd[1493]: time="2025-01-17T13:44:37.930060027Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:37.932542 containerd[1493]: time="2025-01-17T13:44:37.932488662Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Jan 17 13:44:37.933648 containerd[1493]: time="2025-01-17T13:44:37.933614571Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:37.938596 containerd[1493]: time="2025-01-17T13:44:37.938561113Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:37.940302 containerd[1493]: time="2025-01-17T13:44:37.940267924Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 3.097661057s" Jan 17 13:44:37.941314 containerd[1493]: time="2025-01-17T13:44:37.941281832Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Jan 17 13:44:37.946575 containerd[1493]: time="2025-01-17T13:44:37.946542824Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 17 13:44:37.952826 containerd[1493]: time="2025-01-17T13:44:37.952792252Z" level=info msg="CreateContainer within sandbox \"e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 17 13:44:37.972284 systemd-networkd[1410]: cali09a0803225a: Link UP Jan 17 13:44:37.972662 systemd-networkd[1410]: cali09a0803225a: Gained carrier Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.775 [INFO][4695] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0 coredns-7db6d8ff4d- kube-system 17460e26-fb78-4395-87e8-c326be51e429 824 0 2025-01-17 
13:43:57 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-so9hk.gb1.brightbox.com coredns-7db6d8ff4d-267m7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali09a0803225a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" Namespace="kube-system" Pod="coredns-7db6d8ff4d-267m7" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.775 [INFO][4695] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" Namespace="kube-system" Pod="coredns-7db6d8ff4d-267m7" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.863 [INFO][4754] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" HandleID="k8s-pod-network.91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.895 [INFO][4754] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" HandleID="k8s-pod-network.91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000fa8a0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-so9hk.gb1.brightbox.com", "pod":"coredns-7db6d8ff4d-267m7", "timestamp":"2025-01-17 13:44:37.86392191 +0000 UTC"}, Hostname:"srv-so9hk.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.895 [INFO][4754] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.895 [INFO][4754] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
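[Editor's note] The entry above (ipam/ipam_plugin.go 265) dumps the exact ipam.AutoAssignArgs the Calico CNI plugin builds before taking the host-wide IPAM lock. Below is a minimal self-contained sketch of that request; the struct is re-declared locally so the sketch compiles on its own, with field names and values mirroring the log dump, while the real type and the client call that consumes it live in libcalico-go and are not reproduced here.

package main

import (
	"fmt"
	"net"
)

// AutoAssignArgs is re-declared locally for this sketch; fields mirror the
// log's ipam.AutoAssignArgs literal, not the full libcalico-go definition.
type AutoAssignArgs struct {
	Num4, Num6  int
	HandleID    *string
	Attrs       map[string]string
	Hostname    string
	IPv4Pools   []net.IPNet // left empty: "IPv4Pools:[]net.IPNet{}" in the log
	IntendedUse string
}

func main() {
	// The handle is "k8s-pod-network." plus the sandbox container ID, as logged.
	handle := "k8s-pod-network.91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca"
	args := AutoAssignArgs{
		Num4:     1, // matches "Auto-assign 1 ipv4, 0 ipv6 addrs"
		Num6:     0,
		HandleID: &handle,
		Attrs: map[string]string{
			"namespace": "kube-system",
			"node":      "srv-so9hk.gb1.brightbox.com",
			"pod":       "coredns-7db6d8ff4d-267m7",
		},
		Hostname:    "srv-so9hk.gb1.brightbox.com",
		IntendedUse: "Workload",
	}
	fmt.Printf("requesting %d IPv4 address(es) under handle %s\n", args.Num4, *args.HandleID)
}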
Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.895 [INFO][4754] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-so9hk.gb1.brightbox.com' Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.898 [INFO][4754] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.911 [INFO][4754] ipam/ipam.go 372: Looking up existing affinities for host host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.925 [INFO][4754] ipam/ipam.go 489: Trying affinity for 192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.928 [INFO][4754] ipam/ipam.go 155: Attempting to load block cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.937 [INFO][4754] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.57.192/26 host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.937 [INFO][4754] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.57.192/26 handle="k8s-pod-network.91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.940 [INFO][4754] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.949 [INFO][4754] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.57.192/26 handle="k8s-pod-network.91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.963 [INFO][4754] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.57.198/26] block=192.168.57.192/26 handle="k8s-pod-network.91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.964 [INFO][4754] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.57.198/26] handle="k8s-pod-network.91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" host="srv-so9hk.gb1.brightbox.com" Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.964 [INFO][4754] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
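[Editor's note] The sequence above walks Calico's block-affinity path: the host already holds an affinity for 192.168.57.192/26, the block is loaded, and the allocator claims 192.168.57.198. The arithmetic reduces to scanning the /26 for the first unclaimed address. A toy reconstruction follows (not Calico's actual allocator, which tracks a per-block allocation array); treating .192-.194 as already claimed is an assumption, since only .195-.197 appear earlier in this transcript.

package main

import (
	"fmt"
	"net/netip"
)

// nextFree returns the first address in block that is not in used,
// mimicking "Attempting to assign 1 addresses from block".
func nextFree(block netip.Prefix, used map[netip.Addr]bool) (netip.Addr, bool) {
	for a := block.Addr(); block.Contains(a); a = a.Next() {
		if !used[a] {
			return a, true
		}
	}
	return netip.Addr{}, false
}

func main() {
	block := netip.MustParsePrefix("192.168.57.192/26")
	used := map[netip.Addr]bool{}
	// .195-.197 were assigned earlier in this log; .192-.194 are assumed
	// claimed by pods outside this excerpt.
	for _, s := range []string{
		"192.168.57.192", "192.168.57.193", "192.168.57.194",
		"192.168.57.195", "192.168.57.196", "192.168.57.197",
	} {
		used[netip.MustParseAddr(s)] = true
	}
	if a, ok := nextFree(block, used); ok {
		fmt.Println("claimed", a) // prints 192.168.57.198, as in the log
	}
}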
Jan 17 13:44:38.008954 containerd[1493]: 2025-01-17 13:44:37.964 [INFO][4754] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.57.198/26] IPv6=[] ContainerID="91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" HandleID="k8s-pod-network.91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:38.015683 containerd[1493]: 2025-01-17 13:44:37.966 [INFO][4695] cni-plugin/k8s.go 386: Populated endpoint ContainerID="91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" Namespace="kube-system" Pod="coredns-7db6d8ff4d-267m7" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"17460e26-fb78-4395-87e8-c326be51e429", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 43, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7db6d8ff4d-267m7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali09a0803225a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:38.015683 containerd[1493]: 2025-01-17 13:44:37.966 [INFO][4695] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.57.198/32] ContainerID="91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" Namespace="kube-system" Pod="coredns-7db6d8ff4d-267m7" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:38.015683 containerd[1493]: 2025-01-17 13:44:37.966 [INFO][4695] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali09a0803225a ContainerID="91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" Namespace="kube-system" Pod="coredns-7db6d8ff4d-267m7" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:38.015683 containerd[1493]: 2025-01-17 13:44:37.971 [INFO][4695] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" Namespace="kube-system" Pod="coredns-7db6d8ff4d-267m7" 
WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:38.015683 containerd[1493]: 2025-01-17 13:44:37.973 [INFO][4695] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" Namespace="kube-system" Pod="coredns-7db6d8ff4d-267m7" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"17460e26-fb78-4395-87e8-c326be51e429", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 43, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca", Pod:"coredns-7db6d8ff4d-267m7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali09a0803225a", MAC:"12:a6:44:dd:62:51", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:38.015683 containerd[1493]: 2025-01-17 13:44:37.988 [INFO][4695] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca" Namespace="kube-system" Pod="coredns-7db6d8ff4d-267m7" WorkloadEndpoint="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:38.025525 containerd[1493]: time="2025-01-17T13:44:38.023862108Z" level=info msg="CreateContainer within sandbox \"e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"6d2ee6550601ee429bd8e65971c1c98464890457e3027fb75260ba119a706c68\"" Jan 17 13:44:38.025689 containerd[1493]: time="2025-01-17T13:44:38.025659046Z" level=info msg="StartContainer for \"6d2ee6550601ee429bd8e65971c1c98464890457e3027fb75260ba119a706c68\"" Jan 17 13:44:38.113712 containerd[1493]: time="2025-01-17T13:44:38.113355832Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-848c57f8ff-m75gk,Uid:a817919e-4a2a-40c6-a08c-acbbfe256250,Namespace:calico-system,Attempt:1,} returns sandbox id \"3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023\"" Jan 17 
13:44:38.166529 systemd[1]: Started cri-containerd-6d2ee6550601ee429bd8e65971c1c98464890457e3027fb75260ba119a706c68.scope - libcontainer container 6d2ee6550601ee429bd8e65971c1c98464890457e3027fb75260ba119a706c68. Jan 17 13:44:38.177812 containerd[1493]: time="2025-01-17T13:44:38.177658064Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 17 13:44:38.178662 containerd[1493]: time="2025-01-17T13:44:38.177785424Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 17 13:44:38.178662 containerd[1493]: time="2025-01-17T13:44:38.177818359Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:38.178662 containerd[1493]: time="2025-01-17T13:44:38.178015379Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 17 13:44:38.233034 systemd[1]: Started cri-containerd-91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca.scope - libcontainer container 91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca. Jan 17 13:44:38.322619 containerd[1493]: time="2025-01-17T13:44:38.322004006Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-267m7,Uid:17460e26-fb78-4395-87e8-c326be51e429,Namespace:kube-system,Attempt:1,} returns sandbox id \"91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca\"" Jan 17 13:44:38.342551 containerd[1493]: time="2025-01-17T13:44:38.340259773Z" level=info msg="CreateContainer within sandbox \"91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 17 13:44:38.365402 containerd[1493]: time="2025-01-17T13:44:38.365300825Z" level=info msg="StartContainer for \"6d2ee6550601ee429bd8e65971c1c98464890457e3027fb75260ba119a706c68\" returns successfully" Jan 17 13:44:38.411260 containerd[1493]: time="2025-01-17T13:44:38.411164195Z" level=info msg="CreateContainer within sandbox \"91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b50b1bb640f0d4813f8ad0f3a959106c1ee331bd99d7597342d96cf3780fdc39\"" Jan 17 13:44:38.416057 containerd[1493]: time="2025-01-17T13:44:38.416010781Z" level=info msg="StartContainer for \"b50b1bb640f0d4813f8ad0f3a959106c1ee331bd99d7597342d96cf3780fdc39\"" Jan 17 13:44:38.480518 systemd[1]: Started cri-containerd-b50b1bb640f0d4813f8ad0f3a959106c1ee331bd99d7597342d96cf3780fdc39.scope - libcontainer container b50b1bb640f0d4813f8ad0f3a959106c1ee331bd99d7597342d96cf3780fdc39. 
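[Editor's note] Every line in this transcript shares one journald shape: a syslog-style timestamp, a unit name with PID, then the message (often itself a nested containerd or Calico structured log). When pulling events such as the cri-containerd-*.scope starts above out of a capture like this, a small parser suffices; the regular expression below is fitted to this log only, not to journald output in general.

package main

import (
	"fmt"
	"regexp"
)

// Fits lines like "Jan 17 13:44:38.233034 systemd[1]: Started cri-containerd-...".
var entryRE = regexp.MustCompile(`^([A-Z][a-z]{2} \d+ [\d:.]+) ([\w-]+)\[(\d+)\]: (.*)$`)

func main() {
	entry := `Jan 17 13:44:38.233034 systemd[1]: Started cri-containerd-91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca.scope - libcontainer container 91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca.`
	if m := entryRE.FindStringSubmatch(entry); m != nil {
		fmt.Printf("ts=%q unit=%q pid=%q\nmsg=%q\n", m[1], m[2], m[3], m[4])
	}
}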
Jan 17 13:44:38.523807 containerd[1493]: time="2025-01-17T13:44:38.523717206Z" level=info msg="StartContainer for \"b50b1bb640f0d4813f8ad0f3a959106c1ee331bd99d7597342d96cf3780fdc39\" returns successfully" Jan 17 13:44:38.718020 kubelet[2728]: I0117 13:44:38.716987 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-267m7" podStartSLOduration=41.716923623 podStartE2EDuration="41.716923623s" podCreationTimestamp="2025-01-17 13:43:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-17 13:44:38.716195065 +0000 UTC m=+56.703773666" watchObservedRunningTime="2025-01-17 13:44:38.716923623 +0000 UTC m=+56.704502188" Jan 17 13:44:38.774282 systemd-networkd[1410]: cali5ed59e3e92a: Gained IPv6LL Jan 17 13:44:38.965382 systemd-networkd[1410]: cali784966dd6a8: Gained IPv6LL Jan 17 13:44:39.734343 systemd-networkd[1410]: cali09a0803225a: Gained IPv6LL Jan 17 13:44:41.677455 containerd[1493]: time="2025-01-17T13:44:41.677350598Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:41.679790 containerd[1493]: time="2025-01-17T13:44:41.679410901Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Jan 17 13:44:41.680790 containerd[1493]: time="2025-01-17T13:44:41.680748091Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:41.684683 containerd[1493]: time="2025-01-17T13:44:41.684161749Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:41.685452 containerd[1493]: time="2025-01-17T13:44:41.685411121Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 3.738599182s" Jan 17 13:44:41.685544 containerd[1493]: time="2025-01-17T13:44:41.685473439Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 17 13:44:41.690591 containerd[1493]: time="2025-01-17T13:44:41.690520954Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 17 13:44:41.694500 containerd[1493]: time="2025-01-17T13:44:41.693902770Z" level=info msg="CreateContainer within sandbox \"9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 17 13:44:41.716921 containerd[1493]: time="2025-01-17T13:44:41.716855780Z" level=info msg="CreateContainer within sandbox \"9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ab4d39302e4d144b15e384eb663d910ec80f72ee2034102ef591f60a651c2aad\"" Jan 17 13:44:41.717761 containerd[1493]: time="2025-01-17T13:44:41.717713490Z" level=info msg="StartContainer for 
\"ab4d39302e4d144b15e384eb663d910ec80f72ee2034102ef591f60a651c2aad\"" Jan 17 13:44:41.784551 systemd[1]: Started cri-containerd-ab4d39302e4d144b15e384eb663d910ec80f72ee2034102ef591f60a651c2aad.scope - libcontainer container ab4d39302e4d144b15e384eb663d910ec80f72ee2034102ef591f60a651c2aad. Jan 17 13:44:41.855647 containerd[1493]: time="2025-01-17T13:44:41.854900582Z" level=info msg="StartContainer for \"ab4d39302e4d144b15e384eb663d910ec80f72ee2034102ef591f60a651c2aad\" returns successfully" Jan 17 13:44:42.042306 containerd[1493]: time="2025-01-17T13:44:42.041411106Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:42.043829 containerd[1493]: time="2025-01-17T13:44:42.043780242Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Jan 17 13:44:42.047912 containerd[1493]: time="2025-01-17T13:44:42.047860012Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 356.327447ms" Jan 17 13:44:42.048046 containerd[1493]: time="2025-01-17T13:44:42.048018742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 17 13:44:42.049868 containerd[1493]: time="2025-01-17T13:44:42.049659553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Jan 17 13:44:42.053039 containerd[1493]: time="2025-01-17T13:44:42.052990236Z" level=info msg="CreateContainer within sandbox \"c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 17 13:44:42.077527 containerd[1493]: time="2025-01-17T13:44:42.077471573Z" level=info msg="CreateContainer within sandbox \"c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"9fad6bc80670a41f1df62998a2f7c784997cf37d94f25bc1949e0b6c7ed76073\"" Jan 17 13:44:42.080486 containerd[1493]: time="2025-01-17T13:44:42.080367138Z" level=info msg="StartContainer for \"9fad6bc80670a41f1df62998a2f7c784997cf37d94f25bc1949e0b6c7ed76073\"" Jan 17 13:44:42.138406 systemd[1]: Started cri-containerd-9fad6bc80670a41f1df62998a2f7c784997cf37d94f25bc1949e0b6c7ed76073.scope - libcontainer container 9fad6bc80670a41f1df62998a2f7c784997cf37d94f25bc1949e0b6c7ed76073. Jan 17 13:44:42.221127 containerd[1493]: time="2025-01-17T13:44:42.221024364Z" level=info msg="StartContainer for \"9fad6bc80670a41f1df62998a2f7c784997cf37d94f25bc1949e0b6c7ed76073\" returns successfully" Jan 17 13:44:42.281126 containerd[1493]: time="2025-01-17T13:44:42.280910231Z" level=info msg="StopPodSandbox for \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\"" Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.390 [WARNING][5025] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"17460e26-fb78-4395-87e8-c326be51e429", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 43, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca", Pod:"coredns-7db6d8ff4d-267m7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali09a0803225a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.392 [INFO][5025] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.392 [INFO][5025] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" iface="eth0" netns="" Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.392 [INFO][5025] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.392 [INFO][5025] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.487 [INFO][5034] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" HandleID="k8s-pod-network.0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.488 [INFO][5034] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.488 [INFO][5034] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.500 [WARNING][5034] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" HandleID="k8s-pod-network.0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.500 [INFO][5034] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" HandleID="k8s-pod-network.0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.503 [INFO][5034] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:42.513423 containerd[1493]: 2025-01-17 13:44:42.509 [INFO][5025] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:42.515351 containerd[1493]: time="2025-01-17T13:44:42.515307694Z" level=info msg="TearDown network for sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\" successfully" Jan 17 13:44:42.515444 containerd[1493]: time="2025-01-17T13:44:42.515369300Z" level=info msg="StopPodSandbox for \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\" returns successfully" Jan 17 13:44:42.516754 containerd[1493]: time="2025-01-17T13:44:42.516721780Z" level=info msg="RemovePodSandbox for \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\"" Jan 17 13:44:42.516829 containerd[1493]: time="2025-01-17T13:44:42.516771720Z" level=info msg="Forcibly stopping sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\"" Jan 17 13:44:42.716596 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3048273051.mount: Deactivated successfully. Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.641 [WARNING][5054] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"17460e26-fb78-4395-87e8-c326be51e429", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 43, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"91dd35361aa9cf7489ca2beacf8e195e9726e03336326143249ce6464ca5b2ca", Pod:"coredns-7db6d8ff4d-267m7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali09a0803225a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.642 [INFO][5054] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.642 [INFO][5054] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" iface="eth0" netns="" Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.642 [INFO][5054] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.642 [INFO][5054] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.727 [INFO][5062] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" HandleID="k8s-pod-network.0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.727 [INFO][5062] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.727 [INFO][5062] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.743 [WARNING][5062] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" HandleID="k8s-pod-network.0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.743 [INFO][5062] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" HandleID="k8s-pod-network.0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--267m7-eth0" Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.748 [INFO][5062] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:42.754538 containerd[1493]: 2025-01-17 13:44:42.752 [INFO][5054] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3" Jan 17 13:44:42.756932 containerd[1493]: time="2025-01-17T13:44:42.754610399Z" level=info msg="TearDown network for sandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\" successfully" Jan 17 13:44:42.804225 containerd[1493]: time="2025-01-17T13:44:42.802326253Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 17 13:44:42.804225 containerd[1493]: time="2025-01-17T13:44:42.802439206Z" level=info msg="RemovePodSandbox \"0d0ef7b2809a7082c029b3dd5b6743ba9160d621b76ae81b4f9d510234fe12c3\" returns successfully" Jan 17 13:44:42.805920 containerd[1493]: time="2025-01-17T13:44:42.805864790Z" level=info msg="StopPodSandbox for \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\"" Jan 17 13:44:42.821932 kubelet[2728]: I0117 13:44:42.821809 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-55b8dc98c7-f9vfx" podStartSLOduration=34.455613475 podStartE2EDuration="38.821752161s" podCreationTimestamp="2025-01-17 13:44:04 +0000 UTC" firstStartedPulling="2025-01-17 13:44:37.683003503 +0000 UTC m=+55.670582055" lastFinishedPulling="2025-01-17 13:44:42.049142187 +0000 UTC m=+60.036720741" observedRunningTime="2025-01-17 13:44:42.773515511 +0000 UTC m=+60.761094091" watchObservedRunningTime="2025-01-17 13:44:42.821752161 +0000 UTC m=+60.809330721" Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:42.965 [WARNING][5085] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0", GenerateName:"calico-apiserver-55b8dc98c7-", Namespace:"calico-apiserver", SelfLink:"", UID:"3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55b8dc98c7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7", Pod:"calico-apiserver-55b8dc98c7-xbj84", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicfa90ebbbb1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:42.966 [INFO][5085] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:42.966 [INFO][5085] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" iface="eth0" netns="" Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:42.966 [INFO][5085] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:42.967 [INFO][5085] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:43.040 [INFO][5094] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" HandleID="k8s-pod-network.fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:43.042 [INFO][5094] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:43.042 [INFO][5094] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:43.061 [WARNING][5094] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" HandleID="k8s-pod-network.fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:43.061 [INFO][5094] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" HandleID="k8s-pod-network.fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:43.065 [INFO][5094] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:43.071595 containerd[1493]: 2025-01-17 13:44:43.068 [INFO][5085] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:43.071595 containerd[1493]: time="2025-01-17T13:44:43.071296850Z" level=info msg="TearDown network for sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\" successfully" Jan 17 13:44:43.071595 containerd[1493]: time="2025-01-17T13:44:43.071340559Z" level=info msg="StopPodSandbox for \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\" returns successfully" Jan 17 13:44:43.073158 containerd[1493]: time="2025-01-17T13:44:43.073122544Z" level=info msg="RemovePodSandbox for \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\"" Jan 17 13:44:43.073296 containerd[1493]: time="2025-01-17T13:44:43.073164857Z" level=info msg="Forcibly stopping sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\"" Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.158 [WARNING][5113] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0", GenerateName:"calico-apiserver-55b8dc98c7-", Namespace:"calico-apiserver", SelfLink:"", UID:"3dbbcf76-ddc3-459a-8f0a-cf5ce58b5fe3", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55b8dc98c7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"9a4d6bbfd2bba37922ae26b2dc0a97e81d828ef747f2ed37321894358f0df3a7", Pod:"calico-apiserver-55b8dc98c7-xbj84", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicfa90ebbbb1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.159 [INFO][5113] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.159 [INFO][5113] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" iface="eth0" netns="" Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.160 [INFO][5113] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.160 [INFO][5113] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.217 [INFO][5119] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" HandleID="k8s-pod-network.fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.218 [INFO][5119] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.218 [INFO][5119] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.230 [WARNING][5119] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" HandleID="k8s-pod-network.fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.230 [INFO][5119] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" HandleID="k8s-pod-network.fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--xbj84-eth0" Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.232 [INFO][5119] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:43.237240 containerd[1493]: 2025-01-17 13:44:43.235 [INFO][5113] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf" Jan 17 13:44:43.237240 containerd[1493]: time="2025-01-17T13:44:43.237207724Z" level=info msg="TearDown network for sandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\" successfully" Jan 17 13:44:43.243813 containerd[1493]: time="2025-01-17T13:44:43.243774372Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 17 13:44:43.243921 containerd[1493]: time="2025-01-17T13:44:43.243847513Z" level=info msg="RemovePodSandbox \"fa831789d38b24f0308c480e807e3234903afb8df355ef7b997f55b2fde34ecf\" returns successfully" Jan 17 13:44:43.245689 containerd[1493]: time="2025-01-17T13:44:43.245649065Z" level=info msg="StopPodSandbox for \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\"" Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.349 [WARNING][5137] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0", GenerateName:"calico-kube-controllers-848c57f8ff-", Namespace:"calico-system", SelfLink:"", UID:"a817919e-4a2a-40c6-a08c-acbbfe256250", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"848c57f8ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023", Pod:"calico-kube-controllers-848c57f8ff-m75gk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.57.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali784966dd6a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.350 [INFO][5137] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.350 [INFO][5137] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" iface="eth0" netns="" Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.350 [INFO][5137] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.350 [INFO][5137] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.408 [INFO][5146] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" HandleID="k8s-pod-network.8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.409 [INFO][5146] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.409 [INFO][5146] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.420 [WARNING][5146] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" HandleID="k8s-pod-network.8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.420 [INFO][5146] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" HandleID="k8s-pod-network.8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.422 [INFO][5146] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:43.426340 containerd[1493]: 2025-01-17 13:44:43.424 [INFO][5137] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:43.428284 containerd[1493]: time="2025-01-17T13:44:43.426403545Z" level=info msg="TearDown network for sandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\" successfully" Jan 17 13:44:43.428284 containerd[1493]: time="2025-01-17T13:44:43.426442826Z" level=info msg="StopPodSandbox for \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\" returns successfully" Jan 17 13:44:43.428284 containerd[1493]: time="2025-01-17T13:44:43.427267161Z" level=info msg="RemovePodSandbox for \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\"" Jan 17 13:44:43.428284 containerd[1493]: time="2025-01-17T13:44:43.427305153Z" level=info msg="Forcibly stopping sandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\"" Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.493 [WARNING][5168] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0", GenerateName:"calico-kube-controllers-848c57f8ff-", Namespace:"calico-system", SelfLink:"", UID:"a817919e-4a2a-40c6-a08c-acbbfe256250", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"848c57f8ff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023", Pod:"calico-kube-controllers-848c57f8ff-m75gk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.57.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali784966dd6a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.494 [INFO][5168] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.494 [INFO][5168] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" iface="eth0" netns="" Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.494 [INFO][5168] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.494 [INFO][5168] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.572 [INFO][5176] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" HandleID="k8s-pod-network.8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.572 [INFO][5176] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.572 [INFO][5176] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.591 [WARNING][5176] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" HandleID="k8s-pod-network.8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.591 [INFO][5176] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" HandleID="k8s-pod-network.8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--kube--controllers--848c57f8ff--m75gk-eth0" Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.596 [INFO][5176] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:43.603383 containerd[1493]: 2025-01-17 13:44:43.599 [INFO][5168] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31" Jan 17 13:44:43.604552 containerd[1493]: time="2025-01-17T13:44:43.603436842Z" level=info msg="TearDown network for sandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\" successfully" Jan 17 13:44:43.613865 containerd[1493]: time="2025-01-17T13:44:43.613809032Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 17 13:44:43.613951 containerd[1493]: time="2025-01-17T13:44:43.613895774Z" level=info msg="RemovePodSandbox \"8d15faa7b6885254c752284d132c2ea3641c5a9ea79702cc55248beebbc4ec31\" returns successfully" Jan 17 13:44:43.615401 containerd[1493]: time="2025-01-17T13:44:43.615369761Z" level=info msg="StopPodSandbox for \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\"" Jan 17 13:44:43.772390 kubelet[2728]: I0117 13:44:43.771161 2728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.743 [WARNING][5194] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0", GenerateName:"calico-apiserver-55b8dc98c7-", Namespace:"calico-apiserver", SelfLink:"", UID:"a205ba3a-791a-42f3-a1d4-a853e21d6652", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55b8dc98c7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98", Pod:"calico-apiserver-55b8dc98c7-f9vfx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5ed59e3e92a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.745 [INFO][5194] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.745 [INFO][5194] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" iface="eth0" netns="" Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.745 [INFO][5194] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.745 [INFO][5194] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.840 [INFO][5201] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" HandleID="k8s-pod-network.ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.840 [INFO][5201] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.840 [INFO][5201] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.853 [WARNING][5201] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" HandleID="k8s-pod-network.ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.853 [INFO][5201] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" HandleID="k8s-pod-network.ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.856 [INFO][5201] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:43.867391 containerd[1493]: 2025-01-17 13:44:43.860 [INFO][5194] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:43.871552 containerd[1493]: time="2025-01-17T13:44:43.867449666Z" level=info msg="TearDown network for sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\" successfully" Jan 17 13:44:43.871552 containerd[1493]: time="2025-01-17T13:44:43.867495651Z" level=info msg="StopPodSandbox for \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\" returns successfully" Jan 17 13:44:43.871552 containerd[1493]: time="2025-01-17T13:44:43.868103229Z" level=info msg="RemovePodSandbox for \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\"" Jan 17 13:44:43.871552 containerd[1493]: time="2025-01-17T13:44:43.868138031Z" level=info msg="Forcibly stopping sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\"" Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:43.992 [WARNING][5220] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0", GenerateName:"calico-apiserver-55b8dc98c7-", Namespace:"calico-apiserver", SelfLink:"", UID:"a205ba3a-791a-42f3-a1d4-a853e21d6652", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55b8dc98c7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"c202bc57ebd9305b6f18d693704424dfb49cd3fc4cbe465a51c3ddcac0a45f98", Pod:"calico-apiserver-55b8dc98c7-f9vfx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.57.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5ed59e3e92a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:43.995 [INFO][5220] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:43.995 [INFO][5220] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" iface="eth0" netns="" Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:43.995 [INFO][5220] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:43.995 [INFO][5220] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:44.066 [INFO][5227] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" HandleID="k8s-pod-network.ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:44.066 [INFO][5227] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:44.066 [INFO][5227] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:44.079 [WARNING][5227] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" HandleID="k8s-pod-network.ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:44.079 [INFO][5227] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" HandleID="k8s-pod-network.ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Workload="srv--so9hk.gb1.brightbox.com-k8s-calico--apiserver--55b8dc98c7--f9vfx-eth0" Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:44.082 [INFO][5227] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:44.102962 containerd[1493]: 2025-01-17 13:44:44.094 [INFO][5220] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20" Jan 17 13:44:44.102962 containerd[1493]: time="2025-01-17T13:44:44.101556380Z" level=info msg="TearDown network for sandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\" successfully" Jan 17 13:44:44.124233 containerd[1493]: time="2025-01-17T13:44:44.124176582Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 17 13:44:44.124493 containerd[1493]: time="2025-01-17T13:44:44.124458847Z" level=info msg="RemovePodSandbox \"ab143cd1745774fba6efbe513d90504e5f74c62c92760a76d6d0d40fe97bcc20\" returns successfully" Jan 17 13:44:44.125710 containerd[1493]: time="2025-01-17T13:44:44.125141063Z" level=info msg="StopPodSandbox for \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\"" Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.241 [WARNING][5246] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"aede887a-58e3-4229-a42e-9f14a0e48c86", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 43, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e", Pod:"coredns-7db6d8ff4d-nlgqc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicfcbf8534c4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.241 [INFO][5246] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.241 [INFO][5246] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" iface="eth0" netns="" Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.241 [INFO][5246] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.241 [INFO][5246] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.330 [INFO][5253] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" HandleID="k8s-pod-network.28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.331 [INFO][5253] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.331 [INFO][5253] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.343 [WARNING][5253] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" HandleID="k8s-pod-network.28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.343 [INFO][5253] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" HandleID="k8s-pod-network.28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.346 [INFO][5253] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:44.351584 containerd[1493]: 2025-01-17 13:44:44.349 [INFO][5246] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:44.384913 containerd[1493]: time="2025-01-17T13:44:44.352116185Z" level=info msg="TearDown network for sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\" successfully" Jan 17 13:44:44.384913 containerd[1493]: time="2025-01-17T13:44:44.352152814Z" level=info msg="StopPodSandbox for \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\" returns successfully" Jan 17 13:44:44.384913 containerd[1493]: time="2025-01-17T13:44:44.352794926Z" level=info msg="RemovePodSandbox for \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\"" Jan 17 13:44:44.384913 containerd[1493]: time="2025-01-17T13:44:44.352829925Z" level=info msg="Forcibly stopping sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\"" Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.477 [WARNING][5276] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"aede887a-58e3-4229-a42e-9f14a0e48c86", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 43, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"74044c86950f11437bcb2bb6cd2748b5df11d21ba6e904c98c2b04f423e6c27e", Pod:"coredns-7db6d8ff4d-nlgqc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.57.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicfcbf8534c4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.479 [INFO][5276] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.479 [INFO][5276] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" iface="eth0" netns="" Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.479 [INFO][5276] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.479 [INFO][5276] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.541 [INFO][5282] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" HandleID="k8s-pod-network.28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.542 [INFO][5282] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.542 [INFO][5282] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.561 [WARNING][5282] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" HandleID="k8s-pod-network.28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.561 [INFO][5282] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" HandleID="k8s-pod-network.28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Workload="srv--so9hk.gb1.brightbox.com-k8s-coredns--7db6d8ff4d--nlgqc-eth0" Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.567 [INFO][5282] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:44.574488 containerd[1493]: 2025-01-17 13:44:44.569 [INFO][5276] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4" Jan 17 13:44:44.575642 containerd[1493]: time="2025-01-17T13:44:44.575114467Z" level=info msg="TearDown network for sandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\" successfully" Jan 17 13:44:44.584926 containerd[1493]: time="2025-01-17T13:44:44.583543402Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 17 13:44:44.584926 containerd[1493]: time="2025-01-17T13:44:44.583674482Z" level=info msg="RemovePodSandbox \"28bb968a4b1416ada3f404283aad6b928335ca0009de46a5a2df62f02cc47ff4\" returns successfully" Jan 17 13:44:44.592993 containerd[1493]: time="2025-01-17T13:44:44.592731141Z" level=info msg="StopPodSandbox for \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\"" Jan 17 13:44:44.686575 kubelet[2728]: I0117 13:44:44.684283 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-55b8dc98c7-xbj84" podStartSLOduration=36.601763535 podStartE2EDuration="40.684260787s" podCreationTimestamp="2025-01-17 13:44:04 +0000 UTC" firstStartedPulling="2025-01-17 13:44:37.60734398 +0000 UTC m=+55.594922536" lastFinishedPulling="2025-01-17 13:44:41.689841217 +0000 UTC m=+59.677419788" observedRunningTime="2025-01-17 13:44:42.825249461 +0000 UTC m=+60.812828050" watchObservedRunningTime="2025-01-17 13:44:44.684260787 +0000 UTC m=+62.671839345" Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.769 [WARNING][5301] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"41e59e9c-f5c4-48af-a614-7a43cf86d00d", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3", Pod:"csi-node-driver-9fwmg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.57.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieb44a8c3a5d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.769 [INFO][5301] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.769 [INFO][5301] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" iface="eth0" netns="" Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.769 [INFO][5301] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.769 [INFO][5301] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.847 [INFO][5308] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" HandleID="k8s-pod-network.f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.848 [INFO][5308] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.848 [INFO][5308] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.863 [WARNING][5308] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" HandleID="k8s-pod-network.f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.863 [INFO][5308] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" HandleID="k8s-pod-network.f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.865 [INFO][5308] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:44.870987 containerd[1493]: 2025-01-17 13:44:44.868 [INFO][5301] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:44.870987 containerd[1493]: time="2025-01-17T13:44:44.870584307Z" level=info msg="TearDown network for sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\" successfully" Jan 17 13:44:44.870987 containerd[1493]: time="2025-01-17T13:44:44.870621242Z" level=info msg="StopPodSandbox for \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\" returns successfully" Jan 17 13:44:44.873958 containerd[1493]: time="2025-01-17T13:44:44.872693886Z" level=info msg="RemovePodSandbox for \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\"" Jan 17 13:44:44.873958 containerd[1493]: time="2025-01-17T13:44:44.872753635Z" level=info msg="Forcibly stopping sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\"" Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.015 [WARNING][5328] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"41e59e9c-f5c4-48af-a614-7a43cf86d00d", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.January, 17, 13, 44, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-so9hk.gb1.brightbox.com", ContainerID:"e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3", Pod:"csi-node-driver-9fwmg", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.57.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieb44a8c3a5d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.016 [INFO][5328] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.016 [INFO][5328] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" iface="eth0" netns="" Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.016 [INFO][5328] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.016 [INFO][5328] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.136 [INFO][5334] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" HandleID="k8s-pod-network.f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.136 [INFO][5334] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.136 [INFO][5334] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.150 [WARNING][5334] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" HandleID="k8s-pod-network.f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.150 [INFO][5334] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" HandleID="k8s-pod-network.f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Workload="srv--so9hk.gb1.brightbox.com-k8s-csi--node--driver--9fwmg-eth0" Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.153 [INFO][5334] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 17 13:44:45.161646 containerd[1493]: 2025-01-17 13:44:45.157 [INFO][5328] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74" Jan 17 13:44:45.161646 containerd[1493]: time="2025-01-17T13:44:45.161541233Z" level=info msg="TearDown network for sandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\" successfully" Jan 17 13:44:45.165806 containerd[1493]: time="2025-01-17T13:44:45.165758811Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Jan 17 13:44:45.165915 containerd[1493]: time="2025-01-17T13:44:45.165836513Z" level=info msg="RemovePodSandbox \"f0383f4813f59beda2e4957383e033220775c219732ad532e7fc53690c3b7a74\" returns successfully" Jan 17 13:44:45.799231 containerd[1493]: time="2025-01-17T13:44:45.797854276Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:45.800144 containerd[1493]: time="2025-01-17T13:44:45.799647614Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Jan 17 13:44:45.802260 containerd[1493]: time="2025-01-17T13:44:45.801877474Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:45.809092 containerd[1493]: time="2025-01-17T13:44:45.806558237Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:45.809590 containerd[1493]: time="2025-01-17T13:44:45.809554882Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 3.759851642s" Jan 17 13:44:45.809764 containerd[1493]: time="2025-01-17T13:44:45.809733792Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Jan 17 13:44:45.811657 containerd[1493]: time="2025-01-17T13:44:45.811425998Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 17 13:44:45.871744 containerd[1493]: time="2025-01-17T13:44:45.871574315Z" level=info msg="CreateContainer within sandbox \"3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 17 13:44:45.898805 containerd[1493]: time="2025-01-17T13:44:45.898624942Z" level=info msg="CreateContainer within sandbox \"3d2ebbd2dbbce14ae36297914a3a520d2465413922280b24979e95eb8b14d023\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"b194b615a4681e62fe2b10997609c74d426a26dff703a083179c3467755f263d\"" Jan 17 13:44:45.900380 containerd[1493]: time="2025-01-17T13:44:45.899467772Z" level=info msg="StartContainer for \"b194b615a4681e62fe2b10997609c74d426a26dff703a083179c3467755f263d\"" Jan 17 13:44:45.958451 systemd[1]: Started cri-containerd-b194b615a4681e62fe2b10997609c74d426a26dff703a083179c3467755f263d.scope - libcontainer container b194b615a4681e62fe2b10997609c74d426a26dff703a083179c3467755f263d. Jan 17 13:44:46.040305 containerd[1493]: time="2025-01-17T13:44:46.040256340Z" level=info msg="StartContainer for \"b194b615a4681e62fe2b10997609c74d426a26dff703a083179c3467755f263d\" returns successfully" Jan 17 13:44:46.844677 systemd[1]: run-containerd-runc-k8s.io-b194b615a4681e62fe2b10997609c74d426a26dff703a083179c3467755f263d-runc.6x2zGm.mount: Deactivated successfully. Jan 17 13:44:46.893095 kubelet[2728]: I0117 13:44:46.888070 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-848c57f8ff-m75gk" podStartSLOduration=34.195012346 podStartE2EDuration="41.888037081s" podCreationTimestamp="2025-01-17 13:44:05 +0000 UTC" firstStartedPulling="2025-01-17 13:44:38.117984935 +0000 UTC m=+56.105563487" lastFinishedPulling="2025-01-17 13:44:45.81100967 +0000 UTC m=+63.798588222" observedRunningTime="2025-01-17 13:44:46.838557195 +0000 UTC m=+64.826135770" watchObservedRunningTime="2025-01-17 13:44:46.888037081 +0000 UTC m=+64.875615642" Jan 17 13:44:48.362340 containerd[1493]: time="2025-01-17T13:44:48.361929998Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:48.365370 containerd[1493]: time="2025-01-17T13:44:48.364552959Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Jan 17 13:44:48.366859 containerd[1493]: time="2025-01-17T13:44:48.366626575Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:48.372895 containerd[1493]: time="2025-01-17T13:44:48.372476201Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 17 13:44:48.376780 containerd[1493]: time="2025-01-17T13:44:48.376505306Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size 
\"11994117\" in 2.565030152s" Jan 17 13:44:48.377762 containerd[1493]: time="2025-01-17T13:44:48.377713135Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Jan 17 13:44:48.388626 containerd[1493]: time="2025-01-17T13:44:48.388139865Z" level=info msg="CreateContainer within sandbox \"e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 17 13:44:48.452013 containerd[1493]: time="2025-01-17T13:44:48.451481661Z" level=info msg="CreateContainer within sandbox \"e11262d32add0c33a2b64a70e75b6561a1257be3946645d1757f0dd8557dcda3\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"91eb2cb5d464d9027b16f3daf0235225c9aec4b00f502cf4b746291ba3afb231\"" Jan 17 13:44:48.454220 containerd[1493]: time="2025-01-17T13:44:48.454150365Z" level=info msg="StartContainer for \"91eb2cb5d464d9027b16f3daf0235225c9aec4b00f502cf4b746291ba3afb231\"" Jan 17 13:44:48.582391 systemd[1]: Started cri-containerd-91eb2cb5d464d9027b16f3daf0235225c9aec4b00f502cf4b746291ba3afb231.scope - libcontainer container 91eb2cb5d464d9027b16f3daf0235225c9aec4b00f502cf4b746291ba3afb231. Jan 17 13:44:48.694664 containerd[1493]: time="2025-01-17T13:44:48.694568300Z" level=info msg="StartContainer for \"91eb2cb5d464d9027b16f3daf0235225c9aec4b00f502cf4b746291ba3afb231\" returns successfully" Jan 17 13:44:49.634320 kubelet[2728]: I0117 13:44:49.634242 2728 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 17 13:44:49.642348 kubelet[2728]: I0117 13:44:49.642321 2728 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 17 13:44:53.284013 kubelet[2728]: I0117 13:44:53.283885 2728 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-9fwmg" podStartSLOduration=35.742163969 podStartE2EDuration="49.283621402s" podCreationTimestamp="2025-01-17 13:44:04 +0000 UTC" firstStartedPulling="2025-01-17 13:44:34.839716946 +0000 UTC m=+52.827295502" lastFinishedPulling="2025-01-17 13:44:48.381174384 +0000 UTC m=+66.368752935" observedRunningTime="2025-01-17 13:44:48.851568875 +0000 UTC m=+66.839147470" watchObservedRunningTime="2025-01-17 13:44:53.283621402 +0000 UTC m=+71.271199966" Jan 17 13:44:57.171348 systemd[1]: Started sshd@9-10.230.9.254:22-139.178.68.195:46864.service - OpenSSH per-connection server daemon (139.178.68.195:46864). Jan 17 13:44:58.159383 sshd[5491]: Accepted publickey for core from 139.178.68.195 port 46864 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:44:58.163315 sshd[5491]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:44:58.176453 systemd-logind[1480]: New session 12 of user core. Jan 17 13:44:58.182446 systemd[1]: Started session-12.scope - Session 12 of User core. Jan 17 13:44:59.405220 sshd[5491]: pam_unix(sshd:session): session closed for user core Jan 17 13:44:59.415569 systemd[1]: sshd@9-10.230.9.254:22-139.178.68.195:46864.service: Deactivated successfully. Jan 17 13:44:59.424423 systemd[1]: session-12.scope: Deactivated successfully. Jan 17 13:44:59.428663 systemd-logind[1480]: Session 12 logged out. Waiting for processes to exit. 
Jan 17 13:44:59.432249 systemd-logind[1480]: Removed session 12. Jan 17 13:45:04.569725 systemd[1]: Started sshd@10-10.230.9.254:22-139.178.68.195:46868.service - OpenSSH per-connection server daemon (139.178.68.195:46868). Jan 17 13:45:05.486465 sshd[5516]: Accepted publickey for core from 139.178.68.195 port 46868 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:05.488758 sshd[5516]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:05.498512 systemd-logind[1480]: New session 13 of user core. Jan 17 13:45:05.505833 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 17 13:45:06.248287 sshd[5516]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:06.255542 systemd[1]: sshd@10-10.230.9.254:22-139.178.68.195:46868.service: Deactivated successfully. Jan 17 13:45:06.259198 systemd[1]: session-13.scope: Deactivated successfully. Jan 17 13:45:06.261037 systemd-logind[1480]: Session 13 logged out. Waiting for processes to exit. Jan 17 13:45:06.262850 systemd-logind[1480]: Removed session 13. Jan 17 13:45:11.407419 systemd[1]: Started sshd@11-10.230.9.254:22-139.178.68.195:49980.service - OpenSSH per-connection server daemon (139.178.68.195:49980). Jan 17 13:45:12.114225 kubelet[2728]: I0117 13:45:12.113020 2728 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 17 13:45:12.339255 sshd[5550]: Accepted publickey for core from 139.178.68.195 port 49980 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:12.341792 sshd[5550]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:12.351536 systemd-logind[1480]: New session 14 of user core. Jan 17 13:45:12.358403 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 17 13:45:13.121344 sshd[5550]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:13.131047 systemd[1]: sshd@11-10.230.9.254:22-139.178.68.195:49980.service: Deactivated successfully. Jan 17 13:45:13.134898 systemd[1]: session-14.scope: Deactivated successfully. Jan 17 13:45:13.136381 systemd-logind[1480]: Session 14 logged out. Waiting for processes to exit. Jan 17 13:45:13.138582 systemd-logind[1480]: Removed session 14. Jan 17 13:45:13.280353 systemd[1]: Started sshd@12-10.230.9.254:22-139.178.68.195:49986.service - OpenSSH per-connection server daemon (139.178.68.195:49986). Jan 17 13:45:14.207623 sshd[5575]: Accepted publickey for core from 139.178.68.195 port 49986 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:14.211053 sshd[5575]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:14.219137 systemd-logind[1480]: New session 15 of user core. Jan 17 13:45:14.226443 systemd[1]: Started session-15.scope - Session 15 of User core. Jan 17 13:45:15.110171 sshd[5575]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:15.119043 systemd[1]: sshd@12-10.230.9.254:22-139.178.68.195:49986.service: Deactivated successfully. Jan 17 13:45:15.122939 systemd[1]: session-15.scope: Deactivated successfully. Jan 17 13:45:15.126294 systemd-logind[1480]: Session 15 logged out. Waiting for processes to exit. Jan 17 13:45:15.128869 systemd-logind[1480]: Removed session 15. Jan 17 13:45:15.262614 systemd[1]: Started sshd@13-10.230.9.254:22-139.178.68.195:48094.service - OpenSSH per-connection server daemon (139.178.68.195:48094). 
Jan 17 13:45:16.193007 sshd[5586]: Accepted publickey for core from 139.178.68.195 port 48094 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:16.195400 sshd[5586]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:16.204096 systemd-logind[1480]: New session 16 of user core. Jan 17 13:45:16.211423 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 17 13:45:16.915632 sshd[5586]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:16.923044 systemd[1]: sshd@13-10.230.9.254:22-139.178.68.195:48094.service: Deactivated successfully. Jan 17 13:45:16.926365 systemd[1]: session-16.scope: Deactivated successfully. Jan 17 13:45:16.928285 systemd-logind[1480]: Session 16 logged out. Waiting for processes to exit. Jan 17 13:45:16.930055 systemd-logind[1480]: Removed session 16. Jan 17 13:45:22.074699 systemd[1]: Started sshd@14-10.230.9.254:22-139.178.68.195:48108.service - OpenSSH per-connection server daemon (139.178.68.195:48108). Jan 17 13:45:22.987825 sshd[5625]: Accepted publickey for core from 139.178.68.195 port 48108 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:22.991273 sshd[5625]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:23.000090 systemd-logind[1480]: New session 17 of user core. Jan 17 13:45:23.009410 systemd[1]: Started session-17.scope - Session 17 of User core. Jan 17 13:45:23.220616 systemd[1]: run-containerd-runc-k8s.io-669e22c5f197bf9e965b7ddd5cb2984e13e687493d5427958715fbf85805b271-runc.pBIQ28.mount: Deactivated successfully. Jan 17 13:45:23.728619 sshd[5625]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:23.733983 systemd[1]: sshd@14-10.230.9.254:22-139.178.68.195:48108.service: Deactivated successfully. Jan 17 13:45:23.736831 systemd[1]: session-17.scope: Deactivated successfully. Jan 17 13:45:23.738090 systemd-logind[1480]: Session 17 logged out. Waiting for processes to exit. Jan 17 13:45:23.739993 systemd-logind[1480]: Removed session 17. Jan 17 13:45:28.897333 systemd[1]: Started sshd@15-10.230.9.254:22-139.178.68.195:51966.service - OpenSSH per-connection server daemon (139.178.68.195:51966). Jan 17 13:45:29.832874 sshd[5670]: Accepted publickey for core from 139.178.68.195 port 51966 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:29.837844 sshd[5670]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:29.849604 systemd-logind[1480]: New session 18 of user core. Jan 17 13:45:29.855567 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 17 13:45:30.606897 sshd[5670]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:30.614043 systemd-logind[1480]: Session 18 logged out. Waiting for processes to exit. Jan 17 13:45:30.615392 systemd[1]: sshd@15-10.230.9.254:22-139.178.68.195:51966.service: Deactivated successfully. Jan 17 13:45:30.618872 systemd[1]: session-18.scope: Deactivated successfully. Jan 17 13:45:30.620378 systemd-logind[1480]: Removed session 18. Jan 17 13:45:35.761564 systemd[1]: Started sshd@16-10.230.9.254:22-139.178.68.195:59856.service - OpenSSH per-connection server daemon (139.178.68.195:59856). 
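The kubelet pod_startup_latency_tracker entries earlier in the log (13:44:44 for calico-apiserver-55b8dc98c7-xbj84 and 13:44:53 for csi-node-driver-9fwmg) are plain time arithmetic: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration matches that E2E figure minus the image-pull window bounded by the monotonic m=+ offsets (59.677419788 - 55.594922536). The Go sketch below reproduces the calico-apiserver numbers from the logged timestamps; it is illustrative arithmetic only, not kubelet code.

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the kubelet pod_startup_latency_tracker
	// entry for calico-apiserver-55b8dc98c7-xbj84 logged at 13:44:44.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2025-01-17 13:44:04 +0000 UTC")
	running, _ := time.Parse(layout, "2025-01-17 13:44:44.684260787 +0000 UTC")

	// Monotonic offsets (the m=+... values) bounding the image pull.
	firstPull := 55.594922536 // firstStartedPulling
	lastPull := 59.677419788  // lastFinishedPulling

	e2e := running.Sub(created)
	pull := time.Duration((lastPull - firstPull) * float64(time.Second))
	slo := e2e - pull

	fmt.Println("podStartE2EDuration:", e2e) // 40.684260787s
	fmt.Println("image pull window:  ", pull) // ~4.082497252s
	fmt.Println("podStartSLOduration:", slo) // ~36.601763535s
}
```

Run as written, the E2E and SLO figures come out as 40.684260787s and 36.601763535s, matching the logged podStartE2EDuration and podStartSLOduration for that pod.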
Jan 17 13:45:36.694232 sshd[5683]: Accepted publickey for core from 139.178.68.195 port 59856 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:36.697303 sshd[5683]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:36.705957 systemd-logind[1480]: New session 19 of user core. Jan 17 13:45:36.712468 systemd[1]: Started session-19.scope - Session 19 of User core. Jan 17 13:45:37.429271 sshd[5683]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:37.437637 systemd[1]: sshd@16-10.230.9.254:22-139.178.68.195:59856.service: Deactivated successfully. Jan 17 13:45:37.440422 systemd[1]: session-19.scope: Deactivated successfully. Jan 17 13:45:37.441548 systemd-logind[1480]: Session 19 logged out. Waiting for processes to exit. Jan 17 13:45:37.443680 systemd-logind[1480]: Removed session 19. Jan 17 13:45:37.590578 systemd[1]: Started sshd@17-10.230.9.254:22-139.178.68.195:59868.service - OpenSSH per-connection server daemon (139.178.68.195:59868). Jan 17 13:45:38.511230 sshd[5696]: Accepted publickey for core from 139.178.68.195 port 59868 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:38.513533 sshd[5696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:38.526603 systemd-logind[1480]: New session 20 of user core. Jan 17 13:45:38.532772 systemd[1]: Started session-20.scope - Session 20 of User core. Jan 17 13:45:39.637350 sshd[5696]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:39.645284 systemd-logind[1480]: Session 20 logged out. Waiting for processes to exit. Jan 17 13:45:39.646937 systemd[1]: sshd@17-10.230.9.254:22-139.178.68.195:59868.service: Deactivated successfully. Jan 17 13:45:39.653346 systemd[1]: session-20.scope: Deactivated successfully. Jan 17 13:45:39.659271 systemd-logind[1480]: Removed session 20. Jan 17 13:45:39.793544 systemd[1]: Started sshd@18-10.230.9.254:22-139.178.68.195:59870.service - OpenSSH per-connection server daemon (139.178.68.195:59870). Jan 17 13:45:40.739216 sshd[5707]: Accepted publickey for core from 139.178.68.195 port 59870 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:40.742342 sshd[5707]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:40.752561 systemd-logind[1480]: New session 21 of user core. Jan 17 13:45:40.760042 systemd[1]: Started session-21.scope - Session 21 of User core. Jan 17 13:45:44.480074 sshd[5707]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:44.490751 systemd[1]: sshd@18-10.230.9.254:22-139.178.68.195:59870.service: Deactivated successfully. Jan 17 13:45:44.494934 systemd[1]: session-21.scope: Deactivated successfully. Jan 17 13:45:44.497305 systemd-logind[1480]: Session 21 logged out. Waiting for processes to exit. Jan 17 13:45:44.499723 systemd-logind[1480]: Removed session 21. Jan 17 13:45:44.635158 systemd[1]: Started sshd@19-10.230.9.254:22-139.178.68.195:59874.service - OpenSSH per-connection server daemon (139.178.68.195:59874). Jan 17 13:45:45.578614 sshd[5726]: Accepted publickey for core from 139.178.68.195 port 59874 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:45.582285 sshd[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:45.593304 systemd-logind[1480]: New session 22 of user core. Jan 17 13:45:45.600414 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 17 13:45:46.615860 sshd[5726]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:46.622951 systemd[1]: sshd@19-10.230.9.254:22-139.178.68.195:59874.service: Deactivated successfully. Jan 17 13:45:46.627456 systemd[1]: session-22.scope: Deactivated successfully. Jan 17 13:45:46.628658 systemd-logind[1480]: Session 22 logged out. Waiting for processes to exit. Jan 17 13:45:46.630359 systemd-logind[1480]: Removed session 22. Jan 17 13:45:46.771591 systemd[1]: Started sshd@20-10.230.9.254:22-139.178.68.195:54020.service - OpenSSH per-connection server daemon (139.178.68.195:54020). Jan 17 13:45:47.662219 sshd[5737]: Accepted publickey for core from 139.178.68.195 port 54020 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:47.664542 sshd[5737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:47.672538 systemd-logind[1480]: New session 23 of user core. Jan 17 13:45:47.679399 systemd[1]: Started session-23.scope - Session 23 of User core. Jan 17 13:45:48.389834 sshd[5737]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:48.395676 systemd-logind[1480]: Session 23 logged out. Waiting for processes to exit. Jan 17 13:45:48.396144 systemd[1]: sshd@20-10.230.9.254:22-139.178.68.195:54020.service: Deactivated successfully. Jan 17 13:45:48.399027 systemd[1]: session-23.scope: Deactivated successfully. Jan 17 13:45:48.400500 systemd-logind[1480]: Removed session 23. Jan 17 13:45:53.167634 systemd[1]: run-containerd-runc-k8s.io-669e22c5f197bf9e965b7ddd5cb2984e13e687493d5427958715fbf85805b271-runc.OuZaD3.mount: Deactivated successfully. Jan 17 13:45:53.547577 systemd[1]: Started sshd@21-10.230.9.254:22-139.178.68.195:54036.service - OpenSSH per-connection server daemon (139.178.68.195:54036). Jan 17 13:45:54.464013 sshd[5795]: Accepted publickey for core from 139.178.68.195 port 54036 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:45:54.466608 sshd[5795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:45:54.476642 systemd-logind[1480]: New session 24 of user core. Jan 17 13:45:54.481446 systemd[1]: Started session-24.scope - Session 24 of User core. Jan 17 13:45:55.189815 sshd[5795]: pam_unix(sshd:session): session closed for user core Jan 17 13:45:55.194664 systemd-logind[1480]: Session 24 logged out. Waiting for processes to exit. Jan 17 13:45:55.195958 systemd[1]: sshd@21-10.230.9.254:22-139.178.68.195:54036.service: Deactivated successfully. Jan 17 13:45:55.198373 systemd[1]: session-24.scope: Deactivated successfully. Jan 17 13:45:55.200984 systemd-logind[1480]: Removed session 24. Jan 17 13:46:00.355775 systemd[1]: Started sshd@22-10.230.9.254:22-139.178.68.195:60208.service - OpenSSH per-connection server daemon (139.178.68.195:60208). Jan 17 13:46:01.308074 sshd[5818]: Accepted publickey for core from 139.178.68.195 port 60208 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:46:01.312478 sshd[5818]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:46:01.323794 systemd-logind[1480]: New session 25 of user core. Jan 17 13:46:01.331414 systemd[1]: Started session-25.scope - Session 25 of User core. Jan 17 13:46:02.310389 sshd[5818]: pam_unix(sshd:session): session closed for user core Jan 17 13:46:02.315710 systemd[1]: sshd@22-10.230.9.254:22-139.178.68.195:60208.service: Deactivated successfully. Jan 17 13:46:02.322356 systemd[1]: session-25.scope: Deactivated successfully. 
Jan 17 13:46:02.325879 systemd-logind[1480]: Session 25 logged out. Waiting for processes to exit. Jan 17 13:46:02.328974 systemd-logind[1480]: Removed session 25. Jan 17 13:46:07.479758 systemd[1]: Started sshd@23-10.230.9.254:22-139.178.68.195:34882.service - OpenSSH per-connection server daemon (139.178.68.195:34882). Jan 17 13:46:08.371308 sshd[5836]: Accepted publickey for core from 139.178.68.195 port 34882 ssh2: RSA SHA256:2N50fYWfY163AdiG7NRM3ykUxch21WHvePJMC9c47mU Jan 17 13:46:08.373920 sshd[5836]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 17 13:46:08.383996 systemd-logind[1480]: New session 26 of user core. Jan 17 13:46:08.388464 systemd[1]: Started session-26.scope - Session 26 of User core. Jan 17 13:46:09.094151 sshd[5836]: pam_unix(sshd:session): session closed for user core Jan 17 13:46:09.099276 systemd[1]: sshd@23-10.230.9.254:22-139.178.68.195:34882.service: Deactivated successfully. Jan 17 13:46:09.099974 systemd-logind[1480]: Session 26 logged out. Waiting for processes to exit. Jan 17 13:46:09.102720 systemd[1]: session-26.scope: Deactivated successfully. Jan 17 13:46:09.105034 systemd-logind[1480]: Removed session 26.
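From 13:44:57 onward the journal settles into the per-connection sshd/systemd-logind rhythm: Started sshd@N...service, Accepted publickey, pam_unix session opened, New session, session closed, session-N.scope deactivated, Removed session. When auditing a stretch of log like this, pairing the logind "New session" and "Removed session" lines gives per-session lifetimes. The short Go scan below does that over two lines quoted verbatim from session 13 above; everything outside those two quoted lines is a hypothetical helper for illustration, not part of the log or of systemd.

```go
package main

import (
	"bufio"
	"fmt"
	"regexp"
	"strings"
	"time"
)

// journal holds two complete logind entries quoted from the log above
// (session 13); in practice this would be the full journal text.
const journal = `Jan 17 13:45:05.498512 systemd-logind[1480]: New session 13 of user core.
Jan 17 13:45:06.262850 systemd-logind[1480]: Removed session 13.`

// re captures the timestamp, the event kind, and the session number.
var re = regexp.MustCompile(`^(\w+ \d+ [\d:.]+) systemd-logind\[\d+\]: (New|Removed) session (\d+)`)

func main() {
	started := map[string]time.Time{}
	sc := bufio.NewScanner(strings.NewReader(journal))
	for sc.Scan() {
		m := re.FindStringSubmatch(sc.Text())
		if m == nil {
			continue
		}
		// Journal timestamps carry no year; "Jan 2 15:04:05.999999"
		// parses them into year 0, which is fine for subtraction as
		// long as both instants share that base.
		ts, err := time.Parse("Jan 2 15:04:05.999999", m[1])
		if err != nil {
			continue
		}
		switch m[2] {
		case "New":
			started[m[3]] = ts
		case "Removed":
			if t0, ok := started[m[3]]; ok {
				fmt.Printf("session %s lasted %s\n", m[3], ts.Sub(t0))
			}
		}
	}
}
```

For session 13 this prints a lifetime of roughly 764ms, consistent with the open and close stamps in the entries above.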